Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I buy the argument that if you built a learning, self-aware, pain-sensing robot with a fear of death and all that, it would be sentient enough to have rights. But under no circumstances would it ever make sense to make device that had the abilities and rights to defy its owner and transcend its original function. It defeats the purpose of making a machine. Who would make a stove that wouldn't turn on if it wasn't in the mood or a gun that won't shoot because it developed a new moral code algorithm?
youtube
AI Moral Status
2017-02-23T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggmwyliw6Ndm3gCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjagJyEa3ihdHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggaSjYh5W4t03gCoAEC","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgjYo2NEXZe5yngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj7xHSCB362wngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggFmovwouz0T3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggXNbZRYXgRMXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiOm4edH9tF53gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiVIEHUHhcKzngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
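The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a batch might be parsed and validated in Python — note that the allowed vocabularies below are inferred from the values visible on this page, not taken from an official codebook:

```python
import json

# Hypothetical vocabularies inferred from the coded values shown above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "industry_self", "unclear"},
    "emotion": {"indifference", "approval", "fear", "mixed", "unclear"},
}

def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    lookup table keyed by comment ID, rejecting out-of-vocabulary values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example: the first record from the raw response above.
raw = ('[{"id":"ytc_UgiFBSdhZ6OiBHgCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = parse_coding_batch(raw)
print(coded["ytc_UgiFBSdhZ6OiBHgCoAEC"]["emotion"])  # indifference
```

Validating against a fixed vocabulary catches the most common failure mode of LLM coding runs — the model inventing a label outside the codebook — before bad values reach the results table.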