Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yes, we should all hate Musk for a lot of very good reasons ... but take close note here of what Legal Eagle DID NOT tell you, specifically to magnify hate rather than to seek out the truth and better justice. If someone uses a tool in ways the manufacturer does not intend, or forbids, it usually isn't or shouldn't be the manufacturer's fault. Should Ford or GM be sued every time there's an accident due to a driver intentionally speeding, because they didn't prevent the car from being able to speed against posted speed limits - something that is totally possible for any modern car? Tesla has always made it clear that autopilot (and FSD) users are ultimately responsible and must maintain close supervision, regardless of any claims of what the software can do. This driver intentionally violated this. He was also violating the law not just by speeding but by just handling his phone while driving, let alone trying to recover it after dropping it. Also FFS. this video is brutal in how it conflates Autopilot and Full Self Driving, they are not the same thing. They do not have the same abilities. The claims of what FSD can dont apply to Autopilot. It's a shame that all the instances where accidents truly have been avoided, where lives have been saved by the software go totally un-noticed, because nothing happened. There are a lot of idiots who think self driving vehicles must be perfect before they can be accepted, as if all the people who will die from an imperfect but better-than-human system not being available, just dont matter. ... And after stuffing all that misrepresentation into the video, in a case where likely it was just an ignorant jury that didn't like Musk ruling based on their feelings... Legal Eagle then tries to sell its services to us, in case we need a great lawyer. 
Why would I go to you when I can't be sure if you were intentionally biased in this video to feed on our hate from Musk, which is bad enough, or if you were actually too incompetent to recognize the problems I mentioned.
youtube AI Harm Incident 2025-08-15T21:0…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugwq6KiREqPGHazGBmF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugybk-OA1vLU4UcGCbl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzIGBdXNCotCf2F6M54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxYYpk9W-uEudo72_l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxenDH_L2vxATkiBPl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw5UbtFLRYhFPr9fIZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgygLXahYGjAZNrP4ed4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwEA79KnB9Q2VyJ3AB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx1oH75K5SbUzene4J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxExnuQ80F8HWrRPlR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
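The raw response above is a JSON array of per-comment codings, each keyed by a comment id with the same five fields shown in the coding result. A minimal sketch of how such a response could be parsed and looked up by comment id (the two example records are copied from the response above; the field schema is taken from it, and nothing else is assumed):

```python
import json

# Two records copied verbatim from the raw LLM response above;
# the full response contains ten such records.
raw = '''[
  {"id":"ytc_Ugwq6KiREqPGHazGBmF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugw5UbtFLRYhFPr9fIZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]'''

# Parse the array, then index codings by comment id so that the
# coding for any single comment can be retrieved directly.
codings = json.loads(raw)
by_id = {row["id"]: row for row in codings}

print(by_id["ytc_Ugw5UbtFLRYhFPr9fIZ4AaABAg"]["responsibility"])  # prints "user"
```

Indexing by id is what lets a page like this one pair a displayed comment with its row from the batch response.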