Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "This is bad news. Stop development of AI now. I believe the Bible speaks on this…" (ytc_UgzvgLR3A…)
- "AI will replace shit coders, shit lawyers and shit writers. Be good and you are …" (ytc_UgyrR5S9a…)
- "Ai is being inspired in a similar way as humans are. There is not really a diffe…" (ytc_UgyLew8wo…)
- "A human can't connect themselves to Any network and download relevant data spont…" (ytc_Ugx5DCHbI…)
- "We humans need authentic humans with genuine love and compassion. Condolences an…" (ytc_UgwJj_pYq…)
- "That's an interesting take! In the video, Sophia explains that her name actually…" (ytr_UgwP5p6n9…)
- "Dude was using ai to write in the first place, they cut out the middleman 😂…" (ytc_UgyqPmWaS…)
- "That robot must have took notes from me. I really gotta stop sharing my fight st…" (ytc_UgzX6EHtp…)
Comment

> Even with the absence of pain, a sentient robot would still prefer not to be damaged I would think, as this would impair its ability to function properly. Also, if a human being couldn't experience physical or emotional pain, its rights wouldn't be forfeit, so it would be a double standard not to extend this same right to sentient robots.

youtube · AI Moral Status · 2017-08-08T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzdc5ggMEKwzkcZ07V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1YurAqjrGaCWv-MV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzuhpMbmFcwc1EwNSt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugwo7z4sOrI-LDRBRTN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugxy1uFJLiB-VO4r-Fp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzUoGaTpCJ_sGD8lHB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzGQ2081wUKl_7ojaV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzx0n3rUBZhenf_JPh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugja24tjkz6vPHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxkgXw-9xAY0JKPR2x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
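Since the model returns a bare JSON array like the one above, each batch can be parsed and checked before the codes are stored. The sketch below is a minimal validator, assuming the category vocabularies are limited to the values visible in this sample output (the project's full codebook may define additional categories, and `validate_coding` is a hypothetical helper name, not part of any shown pipeline):

```python
import json

# Allowed values per dimension, as observed in the sample LLM output above.
# Assumption: the real codebook may contain categories not seen in this sample.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with a comment ID and one
        # allowed value for every coding dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows with unknown category values are dropped rather than corrected, so a malformed batch surfaces as missing IDs that can be re-queued for recoding.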