Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgzwUSnUb…`: "I'm surprised people don't want AI. They make your job too easy, what more do yo…"
- `ytc_Ugx1vJhON…`: "Same for me when i know it's ai it just loses all the spark it becomes nothing…"
- `ytc_UggzSEiGs…`: "Great video, and a lot to think about when it comes to self driving cars. But th…"
- `ytc_Ugxf24SHz…`: "Every industry is evolving or degrading with this technology. Everyone knows the…"
- `ytr_UgxKN7Kro…`: "Because there’s constantly new channels of this AI content being created daily o…"
- `ytc_Ugz6OOdMf…`: "AI is a problem looking for a problem while poseing as a solution to an unknown …"
- `ytc_Ugx8coPjG…`: "If statement with so many possible answer. Just add a lot of scenarios into one …"
- `ytc_UgwqifmL2…`: "We don’t need bots to be human FOR us! There’s inherent meaning and value in the…"
Comment (source: youtube · video: AI Moral Status · 2025-12-14T01:4…)

> 16% of AI researcher belive AI could result in a human extinction dies not = a 16% chance that it will. It means a small amount of people think its possible. Its like if 16% of scientists came out and said its possible the world is flat, it doesnt mean there's a 16% chance its flat.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxXeOrI671Yi7i7Ft54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgybGHc54odzRpa1FDl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8pI1y-QGyh_KYf4l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyKR9tORfqD-31Saq54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxBAIGNs_g3XVNIxyZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx0D94-P5Nl8D7Hr1t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtOqEKR0Y7CGwVVFx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwy1uQ2a4H-llKYQul4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgycqYoaVGUmUid8Lq94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy0eD_wpQXJ5998UQx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
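The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such output can be parsed and looked up by ID (the schema is assumed from the response shown above, and the `index_by_id` helper is hypothetical, not part of the pipeline):

```python
import json

# Raw model output: a JSON array of per-comment codes. One record is
# reproduced here from the response above for illustration.
raw_response = """[
  {"id": "ytc_UgybGHc54odzRpa1FDl4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and key each record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_by_id(raw_response)
rec = codes["ytc_UgybGHc54odzRpa1FDl4AaABAg"]
print(rec["reasoning"], rec["emotion"])  # consequentialist indifference
```

Keying by ID makes the "look up by comment ID" view above a single dictionary access rather than a scan of the array.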