Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Tesla's excuses are BS and they know it, the technology to limit access to self-driving or "autopilot" features to inattentive drivers and in un-ideal circumstances already exist. I drove to GenCon in a rental 2024 Ford Expedition. It had self-driving capabilities very similar to the ones described in this case (staying in lane, maintaining speed, auto-braking when it detects a vehicle ahead within a certain proximity), but those features were not available (they auto-deactivated and flashed a warning on the dash) when driving more than 10 mph over the speed limit or when I drove below a certain speed where it was presumed I'd need to brake or otherwise react frequently to road conditions (aka anywhere other than a highway), and when I made myself "distracted" (turning my face away from the road but still with it in my peripheral vision on a straight highway with no nearby vehicles, just to see what happened) the car chimed at me similar to what's shown in this video at 13:40 to warn me of my inattention, I don't know what would've happened if I'd remained inattentive because I always resumed normal control after two or three chimes. I can only assume that Tesla engineers were ordered to cut corners to save costs with the presumption that Tesla lawyers could throw the drivers themselves under their own cybercars to avoid responsibility.
Source: youtube · AI Harm Incident · 2025-08-17T22:1… · ♥ 1
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwYfFMHVzVyuhQFdMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxaBDG77cwlFTS-1Ox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzUuC0NA5yU6OFDmF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx8sjZXm1A1ydsIgh54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxHwWjz_9Kh5zFrLNp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxBScCp17c0Czvh5fR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzAHrislxZ1paCuwUB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy06hydIuwlfoY6EOB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxF9C444nMrHGdc5g14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx_dtugAD-9e439fwl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
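A raw response like the one above is a JSON array of per-comment code rows. The sketch below shows one way such a response could be parsed and screened against a codebook before ingestion; it is a minimal illustration, not the pipeline's actual implementation, and the allowed value sets are inferred only from the codes visible in this output (the real codebook may define more).

```python
import json

# Hypothetical codebook, inferred from the values visible in the raw
# response above; the actual codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"company", "user", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference", "resignation"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes
    fall within the codebook's allowed values for every dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items())
    ]

# Example: the first row from the response shown above.
raw = ('[{"id":"ytc_UgwYfFMHVzVyuhQFdMt4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
print(parse_raw_response(raw)[0]["emotion"])  # outrage
```

Rows with any out-of-codebook value are dropped rather than corrected, so a malformed model output surfaces as a missing code instead of a silently miscoded comment.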