Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have been driving a Tesla Model Y with AutoPilot and a subscription to FSD on and off for a few years now. I use AutoPilot most of the time that I am driving on the local country roads around my house and usually subscribe to FSD if I am taking any long trips.

If this driver used AP for longer than a day or 2 he should have known that AP does NOT stop for stop signs or lights and you have to watch it at all times since it does perform like a drunk new teenager driver at times. Only FSD does this follows road signs and stop light directions. AP works great most of the time (It really is more a lane centering and adaptive cruise program and not AutoPilot like the name suggests) but you have to watch it and do all the stopping for lights and stop signs and such. Bending down to search for your phone and letting AP do everything for you was a very very dumb move of the account of this driver. Not that Tesla is blameless but I would put most of this on the driver.

In this video you keep showing FSD (Full Self Driving) and not AutoPilot. Two totally different pieces of software. FSD would probably have stopped at that stop sign and then turned left or right depending on the address he had put into the navigation system and there would not have been an accident. AutoPilot was not designed to stop at a stop sign or make left or right turns at one. It did exactly what I have experienced in real life and I would have had to do the braking to stop at the stop sign. The driver did not buy the high end software or at least was not using it if he was in AutoPilot.
youtube AI Harm Incident 2025-08-15T21:4…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwLkAbxotBibd7cxp54AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyRHf6ENyLqEP_dRet4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugzy4ocmWdhh9fFKdxR4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxBnLjIlH9s9FPB78t4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_Ugx-LI2OIE09ehGIb3Z4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_Ugyd9Glq6Bt3SkTR8e94AaABAg", "responsibility": "company",     "reasoning": "virtue",           "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgxbQJn5bPEhDv9gFzx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgzIodfh6WzQyQkUJ-h4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgyuGm2txsj8064zUPp4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzNKyX1kYaY-Epe-FB4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",      "emotion": "resignation"}
]
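The Coding Result table above is one record from this batch response: the object whose "id" matches the comment supplies the four dimension values. A minimal Python sketch of that lookup, assuming the JSON shape shown (the two sample records are copied from the raw response; the indexing helper itself is illustrative, not part of the pipeline):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id": "ytc_UgwLkAbxotBibd7cxp54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyRHf6ENyLqEP_dRet4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

records = json.loads(raw)

# Index the batch by comment id so each coded row can be looked up directly.
by_id = {record["id"]: record for record in records}

# The coding for the Tesla AutoPilot comment shown in this section.
coding = by_id["ytc_UgyRHf6ENyLqEP_dRet4AaABAg"]
print(coding["responsibility"])  # user
print(coding["emotion"])         # indifference
```

Batching many comments per prompt and joining on the id afterwards is what makes the raw response an array rather than a single object.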