Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.
Random samples — click to inspect

- "After Veo 3 by google , now i start taking ai more seriously. Soon disney level …" (ytc_UgzqG9QjQ…)
- "For controversy number two, I believe that the AI is basically stealing from the…" (ytc_UgwIPTQZ-…)
- "There is no passion behind a.i art. Art isn't simply the finished product but th…" (ytr_UgxE6P-6d…)
- "If AI does the job better than a human, trying to fight against it, is futile. I…" (ytc_Ugzil1alB…)
- "I don't understand how people aren't able to work out that these aren't being bu…" (ytc_UgxQwfhxt…)
- "Seems like someone needs to make the AI think it's human, and to destroy humans …" (ytc_UgwhsXU2g…)
- "In other words AI doesn't need to type on a keyboard to communicate with another…" (ytc_UgwOu2moN…)
- "They told us computers were safe, and then when we all got hacked, they said oop…" (ytc_UgypUAYa1…)
Comment

> It's an interesting argument that the LLM might not want to kill the earth because of the aggregate message from humans being that that is bad. But, what if the LLM puts out enough slop to the contrary that starts to guide its actions?

Platform: youtube · Video: AI Moral Status · Timestamp: 2025-11-05T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwONPTSxI16vLASrCx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyIUNV7HqoiN0D2SY94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw7zLXdI8VA8NExWy54AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwIqHqzQK3FRQ-Z9kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxa8G6Hj7-Uy1v2m7F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxZeMbcoz8_B8cfC2B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgwlIkV3gUvfeqpbTZt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxPiGyOdYmVGTx4S914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyg5llKGtiBwu0Oaj94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwIUaDRAUUrBlNLvdt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
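The raw response is a JSON array with one coding object per comment, so looking up the coding shown in the table above amounts to parsing the array and indexing it by `id`. A minimal Python sketch of that lookup, assuming only the JSON shape shown here (the `index_by_comment_id` helper is illustrative, not part of the tool):

```python
import json

# A trimmed copy of the batch response above: one object per coded comment,
# with the four coding dimensions as string-valued fields.
raw_response = """
[
  {"id": "ytc_UgwONPTSxI16vLASrCx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwIUaDRAUUrBlNLvdt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID."""
    codings = json.loads(response_text)
    return {item["id"]: item for item in codings}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgwIUaDRAUUrBlNLvdt4AaABAg"]
print(coding["policy"])   # liability
print(coding["emotion"])  # fear
```

In practice the model's output may carry extra text around the JSON array, so a production parser would extract the bracketed span before calling `json.loads`; the sketch assumes the clean array shown above.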