Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
1:20 As a Tesla owner myself… I will say this, I use self driving EVERY single day… in San Diego… some of the largest & busiest freeways … and it 1000% drives much safer than I do… however I’m not an idiot that depends on a car to keep me alive, along with following the instructions of the car such as…when I press on the accelerator(because it’s not going fast enough… ) I get a warning that states “car will not auto break if accelerating manually” now if you think about this logically, this would make sense. Us as humans wouldn’t press “go” & stop at the same time, correct? So why would we expect the car to stop if the “human” is telling the machine go? That in itself could pose dangers which is why they warn the driver that auto break will not operate in that circumstance…. I’m sorry but I am SO SICK of people blaming Elon for his inventions when we as the operators need to take accountability for unrealistic expectations and poor judgment/driving. Dont get me wrong, he’s far from perfect, he may over promise on performance but it’s up to us as the operator of these items to read the fine print and take safety measures to ensure our own lives aren’t at risk.
youtube AI Harm Incident 2025-08-19T00:3… ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxDvF4LaK3efIQAbDZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz7wZSiy6jrkNAfWN94AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx6FOs0BxYdvmfP5854AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwvFaeO3VFKxKeLmHV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxBMLfoGXt3bhZa2pB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyf9NbHDOySAzkNsG94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwOU0KPVHGNgSOVUlJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwz3zozfzA-jTUK8rF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw9XV88eNz5jkZTCCJ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwziCpzMTDqs-vWIkp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
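A raw response like the one above can be turned back into per-comment coding records with a short parser. The sketch below is a minimal, hypothetical example: the `ALLOWED` sets are inferred only from the codes that appear in this sample (the full codebook may define more values), and `parse_codings` is an illustrative helper name, not part of any existing tool.

```python
import json

# Allowed codes per dimension, inferred from the sample response above
# (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"company", "user", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "fear", "mixed",
                "indifference", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects) into a
    mapping of comment id -> coded dimensions, validating every code."""
    out = {}
    for rec in json.loads(raw):
        coded = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} code {value!r}")
            coded[dim] = value
        out[rec["id"]] = coded
    return out

# One entry from the response above, matching the coding-result table.
raw = ('[{"id":"ytc_UgxBMLfoGXt3bhZa2pB4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_UgxBMLfoGXt3bhZa2pB4AaABAg"]["emotion"])  # approval
```

Validating against an explicit code set catches the common failure mode where the model invents a label outside the codebook, so bad records fail loudly instead of silently entering the dataset.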