Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID. Random samples:
- `rdc_llf7rmy`: "6. Voting works fine if you confirm Identity when you enter your vote. The fact …"
- `ytc_UgxuAsvod…`: "They practicing AI in war fare in the ukraine 🇺🇦 war also , It killed foreign ai…"
- `ytc_Ugz0AxXvU…`: "The video is about side effects of deploying AI ..but the video itself is made u…"
- `ytc_Ugz8ZHHts…`: "My preliminary research shows; a decent laptop running llama 3B can replace 98% …"
- `ytc_UgwOp-I78…`: "The left Waymo noticed the other was a self driving car and it knows they drive …"
- `ytc_UgxeKM_zK…`: "I'll never get tired of folks ranting about AI. As an artist that had their art …"
- `ytc_UgxgePaPY…`: "No one ask me why im saying to the Oppenheimer chat bot "how do i make a nuke"…"
- `rdc_mva4xct`: "Old folks in Congress letting our society be ruined by technocrats really worked…"
Comment
So wait, the guy disabled the automated system because it 'wasn't driving fast enough for his tastes' (i.e. not speeding), then sues that the disabled system was, in fact, disabled? This crash is the result of pure user error. I doubt complacency featured much in the accident, given the individual's proven track record of unsafe driving, having cited many times where the vehicle locked the user out for irresponsible behavior.
Sure, 'autopilot' is not an appropriate term for the feature, because it absolutely isn't. And if that leads to a general lack of proper driving discipline, then that's a problem that needs to be addressed. However, he overrode the safety systems put into place, and actively hit the accelerator manually. That's on him.
Source: youtube · AI Harm Incident · 2025-08-17T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
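Each coded record can be checked against a fixed vocabulary per dimension. The sets below contain only the values visible in this section (they are likely a subset of the full codebook), and the helper name is illustrative, not part of the tool:

```python
# Dimension vocabularies as observed in this section (likely incomplete).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear", "industry_self", "ban", "liability", "regulate"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate(record):
    """Return the dimensions whose value falls outside the observed vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown above, as a record:
record = {"responsibility": "user", "reasoning": "deontological",
          "policy": "none", "emotion": "outrage"}
print(validate(record))  # -> []
```

A non-empty return value flags a record the model coded with an out-of-vocabulary label, which is worth re-inspecting here.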
Raw LLM Response
```json
[
{"id":"ytc_UgyUvURAXPpt_LnbY6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9mnxS1OupQtXq9bF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxtiWOqFpInUS9L3PB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxV9xQvtQpFpyioxOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxt374a4jhPLUocwwp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzOQxNYoBpW0ClLqgF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwocRcvg5U9DkT4FK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgynGXyWljYnCbeX9EN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgybR10bTgx_lzaRWhV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugy7L8oGz1H-x8rtS9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
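The batch response is a plain JSON array, one object per coded comment, so looking up a comment by its ID reduces to parsing and indexing. A minimal sketch (function and variable names are illustrative, not part of the tool; the two records are abbreviated from the array above):

```python
import json

# Raw LLM batch response: a JSON array of coded records, one per comment ID.
raw_response = """
[
 {"id": "ytc_UgyUvURAXPpt_LnbY6B4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_UgxtiWOqFpInUS9L3PB4AaABAg", "responsibility": "user",
  "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(response_text):
    """Parse the model output and key each coded record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgxtiWOqFpInUS9L3PB4AaABAg"]["emotion"])  # -> outrage
```

Because the model emits the ID alongside each record, the mapping back to the source comment survives batching; a missing or duplicated ID in the parsed dict is a sign the response needs manual inspection.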