Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "A very refined gentleman talking rationally of his concerns about AI. Not quite …" (ytc_Ugwc7MG8E…)
- "Another controversial topic about AI that also has an economic concern is AI rel…" (ytc_UgxuKOSEU…)
- "AI and AGI are the biggest existential threats to humanity. They are going to de…" (ytc_UgzuGdJJ7…)
- "AI will replace the human. When the governments of the world do a way with 3/4 o…" (ytc_UgwhD8zVj…)
- "You people are beyond ridiculous / We don't need robots / God is the creator / robo…" (ytc_Ugzcbh5JZ…)
- "The world is evolving, eventually electronic cars with self driving are going to…" (ytr_Ugxhn5JRr…)
- "@lepidoptera9337 Holy shit u are totally right. Last thing we need is a religiou…" (ytr_Ugx0nr9EY…)
- "Where Altman chases AGI as a synthetic mind, Google builds systems that behave i…" (ytc_Ugy8NEeW-…)
Comment

> I studied environmental science and biology in college, got my bachelor’s degree 15 years ago, and AI was a huge concern back then but I think most people also understood it to be an inevitability. AI super intelligence will happen. There was never any possibility that AI could be ethical or not end in disaster. We have known this for decades.

youtube · AI Moral Status · 2026-01-19T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwjlhdI2LWvj6RhL4Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKw9NAV-gNDhLAFC94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"sadness"},
{"id":"ytc_UgyxBB5eEH9P8nJ0Vzd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy8UdJmuij01_8bkiR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtVQKOuLGgBZjXh5R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugwva8TARf-Aq3w8yyt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxirb56S_SuD5Y9TTt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-Tvo2kfeTEFfMBN94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLZs2e7VAGcZxCE7F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyihtTy5A5QBPNfItV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
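A raw response like the one above should be validated before its codes are stored, since an LLM can emit malformed JSON or out-of-vocabulary labels. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those seen in the samples on this page (the real codebook may define more); the `validate_codes` helper and the `ALLOWED` table are illustrative, not part of the actual pipeline.

```python
import json

# Allowed codes per dimension, inferred from the sample output above.
# ASSUMPTION: the full codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"none", "indifference", "sadness", "outrage", "mixed", "fear", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

raw = '[{"id":"ytc_abc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
print(len(validate_codes(raw)))  # 1
```

Failing fast here keeps a single bad record from silently corrupting the per-dimension tallies shown in the Coding Result table.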