Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgwE_zgCF…`: "the jist of this video ...what little i was able to stomach ... seems to be that…"
- `ytc_UgwaRM6lE…`: "Yup. I did this months ago. I couldnt find a good psychologist who i considered …"
- `ytc_UgxWSCEFI…`: "Wtf AI making the call for all our lives ??? Well, I'm hoping ai will decide to …"
- `ytc_UgyKDKyXG…`: "They are not "AI artist" but rather AI prompters. They don't create art, they st…"
- `ytc_Ugz23-sKN…`: "Is that much scared !!! But when I think that until now Quantum AI that I named …"
- `ytr_UgzkcbqJm…`: "No. Ai companies will go broke cuz nobody wants to pay robots. If the people pre…"
- `rdc_f1el8z5`: "If it's used in diagnosis, what happens when the doctor and the AI disagree? The…"
- `ytc_Ugwr3sZXu…`: "9:10 I have to disagree with you on "tracing this is what AI is doing". There ar…"
Comment
Yes, US legal system please work for once 🙏 I rode in a self driving Tesla one time. It is NOT smooth in the way it turns, it basically stutters as it turns and it tried so hard to avoid a small piece of trash that was in the road while parallel parking bc it didn't know it was (a human would have just run it over instead of trying to avoid it). The weird jerky stuttering turns are simply jarring and uncomfortable. Imagine if the Tesla had made a dangerous maneuver to avoid a piece of paper in the road? Even in this one relatively uneventful ride autopilot is clearly not a substitute for a human driver.
Platform: youtube | Topic: AI Harm Incident | Posted: 2025-08-17T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwYfFMHVzVyuhQFdMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxaBDG77cwlFTS-1Ox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzUuC0NA5yU6OFDmF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx8sjZXm1A1ydsIgh54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHwWjz_9Kh5zFrLNp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxBScCp17c0Czvh5fR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzAHrislxZ1paCuwUB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy06hydIuwlfoY6EOB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxF9C444nMrHGdc5g14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_dtugAD-9e439fwl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
```
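The raw response is a JSON array, one object per comment, with the same four dimensions shown in the coding-result table. A minimal sketch of how such a response might be parsed and validated follows; the `CODEBOOK` values are inferred only from the entries visible above (the real codebook may define more categories), and `parse_coding_response` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the visible examples
# (assumption: the real codebook may contain additional categories).
CODEBOOK = {
    "responsibility": {"company", "user", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "indifference",
                "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only rows that carry an id
    and in-codebook values for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if "id" in row
        and all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items())
    ]

raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
    '"policy":"regulate","emotion":"mixed"},'
    '{"id":"ytc_y","responsibility":"nobody","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"}]'
)
valid = parse_coding_response(raw)
print(len(valid))  # the second row is dropped: "nobody" is not in the codebook
```

Validating against a fixed codebook catches the most common failure mode of LLM coders: an out-of-vocabulary label that would silently corrupt downstream tallies.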