Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
- "We will known in 20 years time when we have all the highways parked up with fo…" (ytc_UgzKgXAkL…)
- "What makes anyone think that super intelligent AI will want to work? I think the…" (ytc_Ugya8Z2b7…)
- "Me literally on the AI five minutes ago when (I never gave it my name and…" (ytc_Ugx6C1Zuk…)
- "Sure there are AI tools that do useful things, but these people (who worship AI)…" (ytc_UgzjBghhW…)
- "I'm gonna say this in the most respectful way possible: Fuck AI art and everythi…" (ytc_Ugx3K9iE6…)
- "So if there was a 90% chance that AI would kill all humans in 10 years, focusing…" (ytc_UgzLNVQwM…)
- "I only use AI when I'm in a hurry and I just need a darn cover, like my pfp, I h…" (ytc_UgzUi91E0…)
- "The girl said the streaming world was shaken to its core, that's peak. I don't w…" (ytc_Ugy9kN6Rd…)
Comment

> Sometimes a solution does not need to be as complicated as it seems. Just have laws that require all AI server farms to have emergency electrical cut off switches. If any harm is being caused, or anything is getting out of hand, you just turn off the electricity, and they are dead...

Source: youtube · AI Harm Incident · 2025-09-12T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxTXe07zAKkEj8GdBd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxIRiA4eeyyiGAxwHx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxS_cmtd7TdT-8OQBt4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwJlmwedxrOOph562p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwsGeU4hJnW-CGHBWV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1rfX6HzTnYG57Rqh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwdj23bI2WAZ-IG6Bd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLQ9euy04xSnXPbld4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxm_1X1kdtwsByrKT14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxSAbHfVGuChAeTHJd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
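A minimal sketch of how a raw batch response like the one above could be parsed and validated before the codes are stored. The allowed value sets below are inferred only from the sample output shown here; the actual codebook may contain additional categories, and the `validate_batch` helper is a hypothetical name, not part of the tool.

```python
import json

# Category sets inferred from the sample output above (assumption:
# the real codebook may include values not seen in this batch).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "resignation", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset all carry the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim}={rec.get(dim)!r}")
    return records

sample = ('[{"id":"ytc_Ugxm_1X1kdtwsByrKT14AaABAg","responsibility":"developer",'
          '"reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]')
print(len(validate_batch(sample)))  # 1
```

Failing fast here keeps a single malformed or off-codebook record from silently entering the coded dataset.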