Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UggaSjYh5…`: "Just limit hardware capability and we'll be fine. Or don't limit hardware capabi…"
- `ytc_UgyK_pef_…`: "With the current state of human leadership, I'm leaning towards letting AI have …"
- `ytc_UgztTPCcz…`: "This guy is extremely smart but we will still need lawyers let's say your Tony S…"
- `ytc_Ugxqx4blj…`: "They used Ai to single out families to make their lives miserable. Sounds like d…"
- `ytc_Ugx3c9jAW…`: "Ai could never be an art , cuz come on u don't even spend more then an hour on i…"
- `ytc_UgwLHWpdO…`: "Is it true that Standford University and students have open sourced a robot, -- …"
- `ytc_UgzZx_j56…`: "Wait. Full stop. The AI said it "would be a member of the One True Religion, the…"
- `ytc_UgyYlxsRB…`: "Anthropic was always about outrageous claims like by now 90% code should be writ…"
Comment

> Robots can never really have the human definition of a "soul." Sure, they can be programmed to be LIKE a human, but that's in their code, meaning that the robot's personality is contained within its body, not anything similar to a human soul.

Source: youtube | Topic: AI Moral Status | Posted: 2017-02-23T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id": "ytc_UggpAKGkeH8npXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Uggl3gbXV3j3aHgCoAEC", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UggfSdgUoWcfqXgCoAEC", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugi8FKB45F8uXngCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugj1URRd5Q4MT3gCoAEC", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UghDo6LsgPj8NXgCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjLCUq4NNmT53gCoAEC", "responsibility": "developer", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgiX0WMGRjD-fHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ughk7nwD9Bd9zHgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Uggx0h3szehwfngCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]
```
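A response in this shape can be validated before the coded values are written back to the dataset. The sketch below is a minimal, hypothetical validator: the allowed value sets per dimension are inferred from the sample output above and may be incomplete relative to the actual codebook, and the function name `validate_batch` is an illustration, not part of the pipeline.

```python
import json

# Allowed labels per coding dimension, inferred from the sample batch
# shown above (assumption: the real codebook may define more labels).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user",
                       "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed rows."""
    rows = json.loads(raw)
    for row in rows:
        # Every row must carry a comment ID with the expected prefix.
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad id: {row.get('id')!r}")
        # Every dimension must be present and hold a known label.
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_Uggx0h3szehwfngCoAEC","responsibility":"none",'
       '"reasoning":"mixed","policy":"unclear","emotion":"resignation"}]')
rows = validate_batch(raw)
print(rows[0]["emotion"])  # → resignation
```

Rows that fail validation can then be re-queued for recoding instead of silently polluting the coded dataset.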