Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples (shown truncated in the viewer):

- ytc_UgykTs3Mc…: "Maybe try Claude. I have never taken any courses in programming, I don't know a…"
- ytc_UgwwW52oq…: "AI told me that AI robotics will be able to create AI robotics, repair AI roboti…"
- ytc_UgxHy080s…: "F*ck all this bullsh*t talk about \"AI making our lives easier\" ... life is suppo…"
- ytc_UgwK5pIhN…: "No either a human has to program the robot to stimulate human emotions or a viru…"
- ytc_UgzVIinh1…: "There's so many things said here that are just a complete misunderstanding of AI…"
- ytr_UgzgOhYPc…: "1. Stop projecting. You know many artists tried to help truckers, right? Also, t…"
- ytr_UgwjElhWm…: "The artist a lot of the time is the art director, among a thousand other things,…"
- ytr_Ugwr180V7…: "Yeah, the main question is what applications exist where that error rate is acce…"
Comment
> Jaywalking with dark cloths at night is dumb, however, the point of this technology is so cars can see what humans cant. It should be BETTER than human, hence its development. I'm a huge proponent of self driving cars, but it needs more work. Hopefully this type of accident will be resolved when it goes mainstream.

youtube · AI Harm Incident · 2018-03-22T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzxHy7uvJZhpcmxlll4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyRLMUoUAjClat7CG94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgyL8WBj8DHdQk_Igr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx_fxnOM5I7wYFz85l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxlunewTn_ITmIHzPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugzfw1Jw6CnMajwDjZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgweCGqcYlfeYk4zsgZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgypaD3yhSCq87jelDp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwbg15AM1VjNdzADvx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy59dP3dxKmQBJahuF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
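A response like the one above has to be parsed and checked before the codes reach the results table. The sketch below shows one minimal way to do that, assuming the allowed category values are exactly those observed in this sample (the real codebook may define more); the function name `parse_llm_response` and the `ytc_`/`ytr_` ID-prefix check are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# shown on this page -- the full codebook may include more (assumption).
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "company"},
    "reasoning": {"deontological", "consequentialist"},
    "policy": {"liability", "ban", "regulate", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "resignation"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with ytc_ (top-level) or
        # ytr_ (reply) -- a pattern seen in the samples above.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Usage: validate one record copied from the raw response above.
raw = ('[{"id":"ytc_Ugx_fxnOM5I7wYFz85l4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"approval"}]')
records = parse_llm_response(raw)
```

A record that fails any check raises immediately, so a malformed model output never lands in the coded table silently.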