Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To me (as a current C.S. student with 20 years of professional electronics and programming experience), the solution is actually quite simple: don't allow any A.I.-controlled vehicle to create a secondary accident. The only choice is for the vehicle to crash into what is ahead of it. First off, if the vehicle following the truck is observing the normal rules of driving (the 2-3 second rule), then there is no reason it cannot come to a stop before colliding with the objects falling from the truck. If there is inclement weather, then the 2-3 second rule should be adjusted accordingly, and the vehicle should still be able to stop. If the owners of the vehicle did not perform the maintenance needed for it to stop properly, the people around them should not be penalized for the owner's negligence. In addition, if the vehicles around it are all A.I.-controlled, then swerving into them could cause reactions that create tertiary accidents; if the initial vehicle maintains its course, those secondary and tertiary accidents are avoided. Finally, given the safety features of vehicles today (which, annoyingly, are cited in the video to justify a secondary accident but not in defense of the vehicle hitting the objects falling from the truck), I feel it is more likely that the occupants of the vehicle will survive the accident without any need to endanger anyone else around them.
Source: YouTube · AI Harm Incident · 2020-09-07T20:5…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | company
Reasoning      | deontological
Policy         | regulate
Emotion        | indifference
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_Ugyr0RMpIPjrTwnqobN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugx-IgbdLnExAW_EhZF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgxN7dBvLWbIzar2GFB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzEykcQ7MVTCsx7g7d4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"}, {"id":"ytc_UgwvKTI5iOHqUc77Qtp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugz0YcSTWrba2PmZtwp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugylzv1Xaz0MgWpxUfJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugw3EQZ9U6NBDWibkPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"}, {"id":"ytc_UgxJ57uNPwCyWaPdS5p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgxsG5XLEWsnnq4EsCR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"} ]