Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Can we automate a few mcdonalds first and let those workers try art instead. Why…" (ytc_UgyptcjPJ…)
- "From Gemini ( paid) Conclusion The video contains elements of fear-mongering. I…" (ytc_UgwK9CIXS…)
- "It seems like Ai Bros dont understand that the process of making art is so satis…" (ytc_UgzF4qbKs…)
- "AI of course could ace any bar or board exams. they have access to the internet…" (ytc_UgxIN-itG…)
- "Yes, it’s for legal reasons. But the reason Waymo doesn’t have a person anymore …" (ytr_UgxdwpNMJ…)
- "sora aí mean that when you put a word are whatever you want it will put a image…" (ytc_UgzNe6fcV…)
- "Ai art is lazy as hell. My class had an assignment which we need to draw a short…" (ytc_UgzCiFuKC…)
- "This robot could help around your home so as you all have more time to be togeth…" (ytc_Ugxjlz4h7…)
Comment
2:25 The Tesla was on AUTOPILOT not FULL SELF DRIVING. Tesla’s autopilot is meant for highways and straight roads. NOT INTERSECTIONS. Not only this, but you are supposed to be fully attentive at all times when using these programs. That is made abundantly clear when signing up for and using these programs. Autopilot did exactly as advertised, however I will admit that the driver safety system should have taken over and stopped the vehicle. So not autopilot, and completely the driver’s fault, but the safety system did not work as intended. This should be a lawsuit against the safety system, not Autopilot. Not to even mention the fact that there are millions more human crashes per 100,000 miles than self-driving cars. You would not sue the car manufacturer if you were on say, adaptive cruise control, which is fundamentally the same thing (except for the steering obviously). The plaintiff is in the wrong!
youtube · AI Harm Incident · 2025-08-15T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyRyYExbjBGN58LckR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1uaIjR8b2_oZMcVZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxagnHKB5-C3ZS0RJJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzTePAZAbXFU8yQZnZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySTtEHDTMFct28Y-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw1KkYSh4ryH7NADE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_1sFSGf7fv5HB4vB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgycT5jiUPvdsr4-dkh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzrqJJ-VUowr7SuJ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9szE_xIODSoEP-uB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
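Before coded values land in a results table like the one above, a response in this format has to be parsed and checked against the codebook. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the responses shown on this page (the full codebook may define more categories), and the function name is illustrative, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, as observed in the batch above.
# ASSUMPTION: inferred from this one response; the real codebook may differ.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "mixed", "resignation"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for row in rows:
        # Drop rows missing an id or using a value outside the codebook.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # 1
```

Filtering silently, as here, is one design choice; a production coder would more likely log or re-prompt on invalid rows so that dropped codings are visible.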