Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any entry to inspect):
- This is so freakish, I am not comfortable with this, I like robots but I don't l… (ytc_UgxvBwcEp…)
- This is why we can't have nice things. No wonder AI has gotten so fucking useles… (rdc_mk77gtt)
- Ai doesn't exist. "I'm not against tech...." All these people are their own wor… (ytc_UgzatnAte…)
- Uhuh, so would you HONESTLY prefer seeing a Jackson Pollock "painting" rather th… (ytc_UgwVMP__S…)
- I made it 13 minutes, but got tired. The flurry of AI slop is only a side effect… (ytc_UgzNB8luo…)
- The video hits the nail on the head about the "training gap." Companies are dump… (ytc_UgzlAPxOZ…)
- how dumb, robots or AI will never have feelings. Best case scenario is for a pro… (ytc_Ugjz82-iE…)
- There is really a very narrow path we have to go down that will likely result in… (ytc_Ugxy2satT…)
Comment (youtube, AI Harm Incident, 2025-01-11T01:5…):

> The government knows that the people will go against them and thats the real reason they want robots and AI. I also think that Elon Musk is trying to build a city in the sky by testing starlink just like in that movie Elysium!
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
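Before a coded record like the one above is stored, it can be checked against the codebook. A minimal validation sketch: the allowed value sets below are only those observed in the raw responses on this page, so the real codebook may contain additional categories.

```python
# Codebook assembled from values observed in the raw LLM responses on this
# page; the full codebook is an assumption and may include more categories.
CODEBOOK = {
    "responsibility": {"government", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems found in one coded record ([] if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record shown in the Coding Result table above:
record = {"id": "ytc_UgyaX9s7UoExz-NA4Ah4AaABAg",
          "responsibility": "government", "reasoning": "deontological",
          "policy": "unclear", "emotion": "fear"}
print(validate(record))  # → []
```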
Raw LLM Response
```json
[{"id":"ytc_UgwpfqzSq4RjWpUbb9h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyeYVQ4wJvHfIRk2tZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPNODWwXxFlkJCQPB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwWM_Rc03i4OdgBblp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz66qEYW9XwBCslyiF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy31sFPoYoDPDaA2ht4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzcv8JcZ0mnfB9Mpet4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwX5-4tFms73kCxDkt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyaX9s7UoExz-NA4Ah4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgztJtq5odK4YpP0l2R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
```
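The batch response is a JSON array, so looking up a coded comment by ID (as the page above does) amounts to parsing it and indexing on the `id` field. A minimal sketch, using one record from the response above (in practice `raw_response` would hold the full ten-record array):

```python
import json

# One record copied from the raw LLM response above; the full response is
# a JSON array of such records, one per comment in the batch.
raw_response = ('[{"id":"ytc_UgyaX9s7UoExz-NA4Ah4AaABAg",'
                '"responsibility":"government","reasoning":"deontological",'
                '"policy":"unclear","emotion":"fear"}]')

records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}  # index for lookup by comment ID

coded = by_id["ytc_UgyaX9s7UoExz-NA4Ah4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # → government fear
```

The indexed form also makes it easy to join each coded record back to its source comment metadata (platform, incident, timestamp) before rendering a table like the Coding Result above.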