Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It really doesn't matter what you call it. A true auto pilot doesn't exist yet and drivers of Tesla who treat it like this are being willfully negligent. I'm not against tesla or auto pilot either, but this isn't the first or last instance of teslas running into stuff. They also have legal disclaimers saying to pay attention, and a system that monitors if you're actually paying attention. If you disregard all those things then it's like going over the zoo fencing to get a closer look at the animal pen. Not everyone will fall in, but there's plenty of warnings for you to disregard, and plenty of examples of it going the wrong way; just ask Harambe. The drivers should get hit with manslaughter as if they were texting, or under impairment. Instead of treating it like an experimental program of which each owner/driver can provide valuable input, they treat it as if it's a luxury feature of an automated private chauffeur. Also, I'm not putting all tesla owners on blast because many of them are responsible. As for motorcycle riders, stay away from the tesla on your 6, because you could be unwillingly playing Russian roulette. We should all know too well how our space and safety gets violated all too easily by distracted drivers. This is just one more hazard to look out for.
youtube AI Harm Incident 2022-09-13T20:3…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          liability
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgytcoWOg5d6TC-CNGt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxWnEcFPJMWPFUQN_B4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzTKozoibJokBH4_tt4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxsyJYcuDm7zA6vb1R4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxsr_WuaGBN7HnXu3t4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy_MQp3jYBYVbNPF6R4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw-iMEgtihCVor1JzB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwMMR92JlZ2lsCNGEp4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwwq2VRDWUGRT725fR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwDysAq0O8qrRhJoqV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
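The coded dimensions shown above are recovered by matching a comment's id against the entry in this raw JSON array. A minimal sketch of that lookup, assuming the raw response parses as a JSON array of objects keyed by "id" (the function name `find_coding` is illustrative, not part of the tool):

```python
import json

# Abbreviated raw LLM response: a JSON array with one object per coded comment.
# Only the entry for the comment inspected above is shown here.
raw_response = '''
[
  {"id": "ytc_Ugwwq2VRDWUGRT725fR4AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "liability", "emotion": "indifference"}
]
'''

def find_coding(raw: str, comment_id: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = find_coding(raw_response, "ytc_Ugwwq2VRDWUGRT725fR4AaABAg")
print(coding["responsibility"], coding["policy"])  # user liability
```

Running the same lookup over the full ten-entry array yields the table above: responsibility "user", reasoning "deontological", policy "liability", emotion "indifference" for this comment.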