Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "My chatgpt choose 27 and say him I will not talk for 27 days so he said wow 27 d…" (ytc_UgyiFjP3Z…)
- "Autopilot is not full self driving, I think the terms are getting mixed up a lot…" (ytc_UgxF9C444…)
- "Thanks for sharing your disagreement! It's important I clarify two things: 1. P…" (ytr_Ugzp3auak…)
- "Thanks Geoffrey, however your theory utterly collapses when examined through the…" (ytc_UgyBVV-hq…)
- "I started drawing frequently at 12 and I was genuinely shit at it until I turned…" (ytc_UgzW2VXs1…)
- "I was watching a movie where this guy was talking to his wife over the computer.…" (ytc_UgxePU0w-…)
- "Factually incorrect... More sensors to barely provide same quality, 6 times more…" (ytr_UgzVLE3uL…)
- "Social media,AI, and greed will end the united states. Civilizations and countri…" (ytc_UgwsLRrCx…)
Comment

> I think for the most part, it works. But like most technology, there are situations where it doesn’t work. Now, some have consequences, some don’t. The real question is: this is not a game, these are lives at risk. A fully self driving car cannot tolerate even a 1% failure rate, as the consequences are much higher. The consumer has to make that call, everyone is an adult. But go in eyes wide open. I’ll put it this way: quite a few engineers I know are skeptical; they may own the cars, but FSD is almost always not used or supervised.

youtube · AI Harm Incident · 2025-10-19T15:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugw5fRClR-ryDDOhnL54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"sympathy"},
{"id":"ytc_UgwUWH9x11OS_amyFJd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3qwMWNhqbH1iXRI94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxBeQaSsGYNjVwdeLp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGeW7t4y54A3UzRpR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugw7TnXIfPG_NBlTKEJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxzhZulEsTdSChaZk94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy79rTOnsI1YeIy0hx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxDAOw6lIoDYI0JiaJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx8tVUvR1oXAb5Hz-14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
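The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed to support the "look up by comment ID" view (the variable names and `lookup` helper are illustrative assumptions, not the tool's actual code; the two entries are copied from the response above):

```python
import json

# Two entries taken verbatim from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgxGeW7t4y54A3UzRpR4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugw7TnXIfPG_NBlTKEJ4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

# Index the coded dimensions by comment ID for O(1) lookup.
codes = {entry["id"]: entry for entry in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the four coded dimensions for one comment ID."""
    return codes[comment_id]

print(lookup("ytc_UgxGeW7t4y54A3UzRpR4AaABAg")["emotion"])  # resignation
```

The first entry matches the Coding Result table above (responsibility `user`, reasoning `consequentialist`, policy `industry_self`, emotion `resignation`), which is how a coded record traces back to its raw model output.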