Raw LLM Responses

Inspect the exact model output for any coded comment. You can look a comment up by its ID, or browse the random samples below.

Random samples
- "Yeah well, nobody during the opioid crisis also was shoving oxycontin into the p…" (ytc_UgzC3ve21…)
- "AI is good if the intention is to help everybody. But the problem is human natur…" (ytc_Ugy8AwMQb…)
- "Our “bad” art. OUR “BAD” ART. I’d like you to know that at least our art has a s…" (ytr_Ugyq_A96d…)
- "I get my AI to make a garden and farm for me so I don't have to worry about food…" (ytc_Ugye7Uuen…)
- "I asked ChatGPT if it could create it’s own GNU/Linux distro from the existing o…" (ytc_UgySwG54g…)
- "Lol I thought ai art is the ultimate huge insult to our great creation that noth…" (ytc_Ugxg2OrJY…)
- "Agreed - even if AI improves a lot, we'll always need people who understand how …" (ytc_Ugxtqn3Nk…)
- "I am happy that I have lived through the golden age of the years 1980 untill 200…" (ytc_Ugz0D7BKw…)
Comment

> All of this is based on the Internet. Either we shut down the Internet, or we keep AI. We can't pick and choose. Maybe it's the Mark of the Beast 🤷. I don't know but, why are we all worrying about something we ourselves are creating? It's like we all want to die.

youtube · AI Moral Status · 2025-12-14T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxXeOrI671Yi7i7Ft54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgybGHc54odzRpa1FDl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8pI1y-QGyh_KYf4l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyKR9tORfqD-31Saq54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxBAIGNs_g3XVNIxyZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx0D94-P5Nl8D7Hr1t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtOqEKR0Y7CGwVVFx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwy1uQ2a4H-llKYQul4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgycqYoaVGUmUid8Lq94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy0eD_wpQXJ5998UQx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
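The lookup-by-ID workflow described above can be sketched in a few lines: the raw LLM response is a JSON array of per-comment labels, so parsing it and keying each record by its `id` field gives direct lookup by comment ID. This is a minimal illustration only; the dashboard's actual implementation is not shown here, and `raw_response`, `index_by_comment_id`, and the two sample records are illustrative.

```python
import json

# Two records copied from the raw response above; in practice this string
# would be the full JSON array returned by the coding model.
raw_response = """
[
  {"id": "ytc_UgxXeOrI671Yi7i7Ft54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy8pI1y-QGyh_KYf4l4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(payload: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(payload)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgxXeOrI671Yi7i7Ft54AaABAg"]["emotion"])  # resignation
```

With such an index, rendering a "Coding Result" card for any comment is a single dictionary lookup rather than a scan of the raw text.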