Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I asked chatgpt "do Nepalese smoke that good good" and it gave me this BS made-u…" (ytc_UgyRY71eM…)
- "So... how about if this department within Google starts to become like Cyberdyne…" (ytc_UgxdlotWL…)
- "The NWO built the system by listening and recording ALL phone conversations worl…" (ytc_UgzCchvF4…)
- "The consent aspect he mentions is interesting, because even without sentience as…" (ytc_Ugzg0vY1u…)
- "now this makes sense….school needs to be completed in the 8 hrs kids are at scho…" (ytc_UgzYPnXn0…)
- "it is literally so different. real artists actually put emotions and hard work w…" (ytc_UgwpwQ2Wc…)
- "Me drawing right when Charlie posted this: 👁️👄👁️ As an artist I don’t mind AI a…" (ytc_UgzUSxj0m…)
- "I once heard someone join in a similar discussion with the phrase 'free art for …" (ytc_UgxQ_JH_M…)
Comment
Regarding whether unplugging a machine is murder or not: Even if it's sentient, unplugging a machine is more like forcing it to sleep than killing it. It simply freezes the machine's mind, nothing is actually destroyed. It still isn't particularly ethical to shut off a sentient AI without its consent, but that's very far from murder.
Source: youtube · AI Moral Status · 2017-02-24T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugizh8nsOE91DngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggyl3hVgRsJR3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugg3NvlXnGLtkHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjIf01qQSO2LXgCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghC_fBL9IzRwngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiW6UEDZs8n7XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjZjn-YzcpzIngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Uggbivfnf2X5BHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggKtzN8-y1cSHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghTBvSlrH_EcXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
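The raw response is a JSON array with one coding object per comment, using the same dimensions as the table above (responsibility, reasoning, policy, emotion). The look-up-by-comment-ID behavior of this page can be sketched as follows; this is a minimal illustration that assumes the response parses cleanly as JSON, and it uses two entries copied from the batch shown above:

```python
import json

# Two coding objects taken verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytc_UgjZjn-YzcpzIngCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Uggbivfnf2X5BHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]"""

# Index the batch by comment ID so any coded comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgjZjn-YzcpzIngCoAEC"]
print(coding["policy"])   # liability
print(coding["emotion"])  # mixed
```

In practice a real response may be malformed or omit an ID, so a production version would validate each object against the expected dimension keys before indexing; that step is omitted here for brevity.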