Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- `ytc_UgzQnSfSV…` — "China is using AI integrated with Infrastructure / US is using AI for Financial …"
- `ytc_UgzfLUOfQ…` — "This is like the worst report i have seen in a long time. Dont get me wrong, you…"
- `ytc_UgyBMQzpm…` — "1:02:58 well that’s says it all the man is an atheist . They can’t comprehend wh…"
- `ytc_UgzzpawuZ…` — "12:30 Not quite true. I have experienced LLM synthesizing information into ways …"
- `ytc_Ugz3S5C_l…` — "Its crazy how everyone in the comments is talking bad about but have there phone…"
- `ytc_UgzcwhO9e…` — "Yeah, he tried to warn us, all the while creating X AI, the largest known AI net…"
- `ytc_UgxIWl9yL…` — "You could use water for hydrogen. And the hydrogen for nuclear fusion. Like peop…"
- `ytc_UgzflXTuP…` — "Obviously, all Ai should carry a "water mark" of sorts! Robots are more obviousl…"
Comment

> Whilst parallels are often drawn between Oppenheimer and the Manhattan Project and the current AI race, the significant differences are:
>
> The impact of atomic weapons was known and this resulted in the 1968 Nuclear Non-proliferation Treaty. The potential impacts of AI are both widespread but also unknown. This results in a fragmented approach to control.
>
> The other main difference is greed. Investment into AI will not stop due to the perceived wealth benefits, not to the masses, but to the few.
>
> So humanity faces potential catastrophic risks, to enable the greed of the few….
youtube · AI Harm Incident · 2026-01-31T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzRSuVfUbdg4sXsigJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwP5SJZTi59XmQQUxV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyquY6oPbpNXsUfAU54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxTkMm0hVvcKXUiwN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx3mPod9kqPUxYZkZR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx3y5Lzo1_eGtu4krl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwrHksERWX-zLFuFGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyO3ANVYSXIJf2dCih4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwcLowSgxVXon9ibZ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzyYN9lj3jDkyDFCeh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
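The raw response is a JSON array with one record per comment, each carrying an `id` plus the four coding dimensions. A minimal sketch of how such output might be parsed and validated before use — note the allowed value sets below are inferred only from the samples shown on this page, not from a documented codebook:

```python
import json

# Value vocabularies per dimension, inferred from the coded samples above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting records with unknown dimensions or out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {k: v for k, v in rec.items() if k != "id"}
        for dim, val in codes.items():
            if dim not in ALLOWED or val not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={val!r}")
        coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_UgzRSuVfUbdg4sXsigJ4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(parse_codes(raw)["ytc_UgzRSuVfUbdg4sXsigJ4AaABAg"]["emotion"])  # → fear
```

Validating against a closed vocabulary like this is what makes a malformed or hallucinated code (e.g. `"emotion": "anger"`) fail loudly at ingestion rather than silently skewing the coded dataset.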