Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "My question is why do we suddenly need AI so badly? The talk about risks is what…" (ytc_UgyP32EFA…)
- "All power stations would need to be completely ringfenced from networks, AI and …" (ytc_Ugx5LHkot…)
- "They still can't make them move like humans in 2025. I knew right away this thin…" (ytc_UgyltYsLz…)
- "Is this why the automatic sink in public restrooms barely work for me because it…" (ytc_UgyoEw6oK…)
- "I’ve been following AI since IBM Watson, and even though I freaking loved ChatGP…" (ytc_Ugx4XaegJ…)
- "Let AI work for us and let's live a life dedicated to human connection. This cou…" (ytc_UgylkwL8X…)
- "Yeah, if you make music with the A.I. apps, and post your creation on youtube, t…" (ytc_Ugwfb0X3C…)
- "The creativity is the reason why ai probably will not take like all jobs in IT s…" (ytc_UgxehTyw3…)
Comment
In Simon R. Green's Deathstalker series, the Enemies of Humanity turn out to have been created by humanity itself, and AI is no different but it will probably destroy humanity through over-consumption of natural resources, worsen climate change, create and spread mis-information and propaganda and encourage humanity as a whole to dumb itself down by using it do every task that can and should be done by people.
youtube · AI Moral Status · 2026-02-12T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgziCu7la66Tpq-dLvN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxDVFHRxQSEiieHFSh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgythoQZ8G2XJqQKI9p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwt3-XUR-1JdvfvAJt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxbarWoKwoxJ97XAMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5Ltesa6yIbqGxUF14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzIZEoDxWzBxYngrsF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwtAhQ04h4Ov-rZIud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugymo4woJ3eTuzznHMB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxBTOneTaGa9_lPuuJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
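A raw response in this shape can be checked before the codes are stored. The sketch below is a minimal validator, assuming the category vocabularies inferred from the samples on this page (the full codebook may define additional values); the function name `validate_response` and the `SCHEMA` mapping are illustrative, not part of the pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook may include categories not seen here.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"mixed", "unclear", "consequentialist"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "fear", "unclear"},
}

def validate_response(raw: str) -> list:
    """Parse one raw LLM response and check each row against SCHEMA.

    Raises ValueError on a missing id, a missing dimension, or an
    out-of-vocabulary code, so a bad batch fails loudly instead of
    silently polluting the coded dataset.
    """
    rows = json.loads(raw)
    for i, row in enumerate(rows):
        if "id" not in row:
            raise ValueError(f"row {i}: missing comment id")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"row {i} ({row['id']}): bad {dim}={value!r}")
    return rows
```

Validating at ingest keeps the "Coding Result" table trustworthy: any row the model labels outside the codebook is rejected rather than displayed.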