Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Comment

And who are the knuckleheads to think they have the right to move forward with AI when they know the likely fate would be that AI would annihilate the human race? Why do they get to decide the fate of mankind? Are they so blinded by greed that they literally don't care if they and everyone else die so long as they're billionaires for whatever time is left before we go extinct? They shouldn't have the power to decide that for everybody on the planet.

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Harm Incident |
| Posted at | 2025-10-24T03:0… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
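Each coded dimension takes a value from a closed set. A minimal sketch of validating one coding result against those sets, assuming the value sets inferred from the records visible on this page (the full codebook may contain additional values):

```python
# Value sets inferred from the records shown on this page; the
# authoritative codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"none", "unclear", "developer", "company",
                       "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "unclear", "ban", "regulate", "liability",
               "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "resignation",
                "approval", "mixed"},
}

def validate(coding: dict) -> list:
    """Return (dimension, value) pairs that fall outside the known sets."""
    return [(dim, val) for dim, val in coding.items()
            if dim in ALLOWED and val not in ALLOWED[dim]]

# The coding result shown in the table above.
result = {"responsibility": "company", "reasoning": "deontological",
          "policy": "ban", "emotion": "outrage"}
print(validate(result))  # → [] (all values are in the known sets)
```

Unknown keys (such as `id` or a timestamp) are ignored, so the same check can be run directly on records from the raw LLM response below.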
Raw LLM Response
```json
[
{"id":"ytc_UgyKmSV0OZDZ-TPcAT14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTTV5oyNDissKWrEd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzC4tbDhTHL_9FMQ4B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzcu6W_FPjx41hXDtd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw4RZrD0AiYNhVIZ094AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwAx4kqjH0G952cDvN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwUt_psoPwxUUQVVSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx43BNLA8lPUoXYdON4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx5fM_JwxAKSsK1p2N4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzAguP4suPooT6ZovJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
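Each raw response is a JSON array of per-comment records keyed by `id`, which is what makes lookup by comment ID possible. A minimal sketch of that lookup, assuming the response text has already been retrieved (the `raw` string below reproduces two records from the array above):

```python
import json

# Two records copied from the raw LLM response above.
raw = """[
{"id":"ytc_UgwAx4kqjH0G952cDvN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx43BNLA8lPUoXYdON4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw)
record = coded["ytc_UgwAx4kqjH0G952cDvN4AaABAg"]
print(record["policy"])  # → ban
```

Building the dict once and reusing it keeps repeated lookups O(1) instead of rescanning the array for every inspected comment.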