Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The dudes making self-driving trucks were the same ones reminding the teacher to…
ytc_UgyufiEiR…
Awesome teacher, voice clear easily to understand and sweet soft voice create in…
ytc_Ugzj-F_jZ…
The question of “What the goal of education is” is the bedrock issue of educatio…
ytc_UgxXHm3YQ…
AI needs to be categorized as a WMBD and controlled as such. Anyone engaging in …
ytc_UgwpCxRGn…
The thing is AI cannot see and uses RGB debugging so every wrong details is not …
ytc_UgwQwuBwb…
@wowJhil Yes that situation will only come right once people realise that the au…
ytr_UgyJn75KC…
I'm not sure I see the problem. First, unless you tell ChatGPT your name along w…
ytc_UgwmK4kq4…
No matter what people do its futile, AI is the future and real artist are going …
ytc_UgwKM6lrL…
Comment
Like, I understand why this is a bad thing, but all I'm hearing is all this is gonna do is hold us back from the technology we need for actual mood self driving cars. I wish people would get over themselves and understand that people will die. Trying to perfect the perfect self driving car. There were thousands of people that died on roads. Hit by cars, before we decided to put lines on the streets. And the more people that die in self driving vehicles, the better the self driving vehicles will get, until eventually.Their perfect that being said, I don't condone the people behind this at all. I'm just saying technology has downsides that you have to put up with to perfect the technology and it should be noted that it specifically says it was his own negligence, not the vehicle. He overrode the vehicle settings that makes him responsible, not the vehicle
youtube · AI Harm Incident · 2025-08-15T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyRyYExbjBGN58LckR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1uaIjR8b2_oZMcVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxagnHKB5-C3ZS0RJJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzTePAZAbXFU8yQZnZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySTtEHDTMFct28Y-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw1KkYSh4ryH7NADE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_1sFSGf7fv5HB4vB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgycT5jiUPvdsr4-dkh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwzrqJJ-VUowr7SuJ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9szE_xIODSoEP-uB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
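Each record in the raw response carries the same four coding dimensions shown in the table above. A minimal sketch of validating such a response in Python, assuming the category vocabularies are limited to the values seen in these samples (the full codebook may define more) and that comment IDs use the `ytc_`/`ytr_` prefixes seen here:

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above; this is an assumption, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def validate_records(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    for rec in json.loads(raw):
        rid = rec.get("id", "<missing id>")
        if not rid.startswith(("ytc_", "ytr_")):
            errors.append(f"{rid}: unexpected id prefix")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(f"{rid}: {dim}={value!r} not in codebook")
    return errors

raw = ('[{"id":"ytc_UgyRyYExbjBGN58LckR4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
print(validate_records(raw))  # → []
```

Running this check before accepting a batch catches the common failure modes of structured LLM output: malformed JSON (raised by `json.loads`), missing keys, and values outside the coding scheme.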