Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You're kidding me. So the guy steps on the gas and holds it down, and its somehow the self driving software's fault that it didn't force him to stop? If you take control of the vehicle you're responsible for its behavior. "Autopilot is misleading" so is saying this man wasn't responsible for his own death. The biggest issue here is that people VASTLY overestimate the capability of most human drivers and then expect self driving cars to be perfect (beyond the already elevated expectations). The autopilot software is already superior to human drivers, not just because it outright drives better but also because it is much more attentive and doesn't have to turn its head to see in all directions at once. Its got a lot more advantages than that besides, I'm just not going to list them all here right now. Accidents happen all the time. Go look up how many people die because of vehicle design and manufacturing flaws. Its a lot more than Tesla and we should not stop pushing for this self driving technology just because of a bunch of haters don't like Tesla or Elon. I'm not saying deaths are ok, but I am saying have some perspective and not such obvious bias.
youtube AI Harm Incident 2025-08-16T02:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyTp4bS-FxEYBWa_2R4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugy8lbz2-ZDkN6IZbG14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzPvbGYo29-rcR8b1p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyR5B9KXHgr2nBlMMB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxZgShccTacBLeLF3x4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"},
  {"id": "ytc_Ugw5GL5gHFEFix5GnUR4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxeZVWqD4x5xANjm8p4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyBrGwuGA7xpoPWgBB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzoCWixuWHaQ8HgI9t4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw6mxPBMtanB5Is5hJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]
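The model returns one batch array for several comments, so the per-comment coding shown above has to be pulled out of the array by its `id`. A minimal sketch of that lookup, assuming the raw response is valid JSON as here (the `coding_for` helper and the abridged two-record `raw` string are illustrative, not part of the tool):

```python
import json

# Raw batch response as emitted by the model (abridged to two records here).
raw = '''[
 {"id":"ytc_UgxeZVWqD4x5xANjm8p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyTp4bS-FxEYBWa_2R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]'''

def coding_for(raw_response, comment_id):
    """Return the coding dict for one comment id, or None if it is absent."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

coding = coding_for(raw, "ytc_UgxeZVWqD4x5xANjm8p4AaABAg")
print(coding["responsibility"], coding["emotion"])  # prints "user outrage"
```

Matching the coded table against the raw array this way also makes it easy to flag ids the model dropped from a batch, since `coding_for` simply returns `None` for them.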