Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This sounds like a sad death cry of someone who is unwilling to accept the end i…" (ytc_Ugyr8XJFc…)
- "My husband is constantly arguing with Chat GPT, because he just wants to get IT …" (ytc_Ugyp1QSzp…)
- "Here’s what Grok had to say: “No, ChatGPT isn't an IRL Lovecraftian entity—it's …" (ytc_UgxlIE7kw…)
- "I would love to see AI come up with an efficient and practical way to harness th…" (ytc_UgwV0zwZO…)
- "I had an old laptop that pulled 50 watts max. I set up as a Bitcoin miner. And p…" (rdc_oh3gkzx)
- "We have also kinda programmed ourselves what to do in situations like this. We k…" (ytc_UgyHUa4Mw…)
- "guys I'm 24 minutes in and yr only talking about LLMs. why is superintelligence …" (ytc_UgzGbjN8C…)
- "Just take a took at insurance companies using ai to decline your medication, sur…" (ytc_UgwOAb7dl…)
Comment
There are a lot of humans online saying that humanity shouldn't exist, so you can't count on an AI to know that would be bad because not every human agrees.
And also an LLM made to act as a sentient super intelligence might turn out like all those dangerous fictional super intelligences in media that is in its database
youtube · AI Moral Status · 2025-11-03T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxl-irZ24TQH6hWA-x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2N4VSLpWYiaAU9Nl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTRDRI6ihW6Y7mXL14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw3hgWijWat3sIFkE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz6RubrE5SGRB6CNSp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgywsXEBKXwBoOtxibl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwy4DLEMkskCHsxpHJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxz4cdA5FdLHOEzscN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugx29BBr5-nygEAtEH94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzeWfR6lzNkl1wAnbV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
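The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed, validated, and looked up by comment ID, assuming the per-dimension vocabularies are exactly the values observed in these responses (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# This vocabulary is an assumption; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

# Two rows copied from the raw response above, standing in for a full batch.
raw = '''[
 {"id":"ytc_Ugxl-irZ24TQH6hWA-x4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugw3hgWijWat3sIFkE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''


def validate(batch):
    """Return (row_index, dimension, value) for every out-of-vocabulary code."""
    errors = []
    for i, row in enumerate(batch):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                errors.append((i, dim, row.get(dim)))
    return errors


def lookup(batch, id_prefix):
    """Find coded rows by comment ID, accepting a truncated prefix
    like the ytc_Ugyr8XJFc… IDs rendered in the sample list."""
    return [row for row in batch if row["id"].startswith(id_prefix)]


batch = json.loads(raw)
print(validate(batch))                            # [] — every code is in vocabulary
print(lookup(batch, "ytc_Ugw3")[0]["emotion"])    # outrage
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a new label; flagged rows can then be re-coded rather than silently pooled into the analysis.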