Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Elon defence bots in full force in replies, literally in some cases likely using Grok (yeah mate real people use automobile there). Some truly pathetic actions, either being bootlicker, too devoted employee or a Clanker. And from what's come out even Tesla's assist feature is trash let alone the thing they call self driving for ads and not such when in court. Assist feature when you actually put proper effort into has been fairly effective in other cars. Like true self driving cars is pretty possible but you do have to give it enough cameras it's got better vision than a driver, ways to communicate to other cars better than what humans use of traffic lights, car lights and hand gestures and can still identify crossings even if quite faded...and also ironically enough can't work in as car centric society as the US you need good cities for them and public transport that keeps roads less clogged and less people likely needing to cross since more cars and especially more people always equals more risk. Claiming at this stage while pretty clearly trying to cut costs that its better than human drivers was just begging for them to take some responsibility in an accident, actually a place you have incentive to downplay how good it is.
youtube AI Harm Incident 2025-08-16T14:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxpK2VppfoEBZr8fId4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyoZhy342v9em11HWp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxG3E_a9dEZ-CaYXPt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxOldRalbDSWPh9TZt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyr796iOPkskHZzFud4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxXrlZuSX_UI3euodl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzPfTV_FExZZehM9ZV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxBAhbcTAQ6JGSAyuJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzDFwG20A9EOrs2lp14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwGMlbj3wEu4AEVqEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
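The raw response is a JSON array of per-comment codes, keyed by comment id. A minimal sketch of looking up the codes for one comment, assuming the structure shown above (the two rows used here are taken verbatim from the response):

```python
import json

# Raw LLM response: a JSON array of per-comment code objects,
# as in the response shown above (truncated to two rows for brevity).
raw = '''[
  {"id": "ytc_UgxG3E_a9dEZ-CaYXPt4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxpK2VppfoEBZr8fId4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]'''

# Index the rows by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

def codes_for(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment id."""
    return codes[comment_id]

result = codes_for("ytc_UgxG3E_a9dEZ-CaYXPt4AaABAg")
print(result["responsibility"], result["emotion"])  # company outrage
```

This matches the coded result above: the comment ending in `CaYXPt4AaABAg` carries `company` / `deontological` / `liability` / `outrage`.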