Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
For me it's kind of hard to answer, because a self-driving car that drives at an unsafe distance from another vehicle is badly programmed, and in any event the company should be sued. If the car is too close to stop, then it is the company's fault. If we are talking about a world where all cars are self-driving, car 1 (the one about to crash) could send a message to the nearby cars so they re-arrange formation, opening space and letting car 1 change lanes and avoid the crash. This could even come from the truck, where the system notices a failure and the objects being dropped, and sends a message to nearby cars in response so they change lanes. In any case, if there is no way the car can escape the crash, the only thing it should be programmed to do is stop and use everything in its system to try to save the passengers. If it crashed into any other vehicle, that would be an attack on that vehicle's passengers.
youtube AI Harm Incident 2015-12-14T23:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgjaTrm3xjVlsXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiVJWa_Y6bmRXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghRt6TFpVDC0HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjWo4JZkIB25ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggQIWXW0Sjhu3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugh4kvsWRmbvVXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghHt-JHGMzZ1XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgiJ_L1RWjzFSHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugiygo6Qdg1iq3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggxWJ27f-_UIngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
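The raw response above is a JSON array with one coding object per comment. A minimal sketch of how such a batch response could be parsed and indexed by comment id (the helper name and the two-entry sample payload are illustrative, not part of the tool itself):

```python
import json

# Hypothetical raw response, truncated to two entries for illustration;
# the real output is a JSON array of per-comment codings as shown above.
raw = '''[
 {"id":"ytc_UggQIWXW0Sjhu3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UggxWJ27f-_UIngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]'''

# The four coded dimensions that appear in every object.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict:
    """Parse the model output and index each coding by comment id."""
    rows = json.loads(payload)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codings = index_codings(raw)
# Look up the coding for a single comment by its id.
print(codings["ytc_UggQIWXW0Sjhu3gCoAEC"]["emotion"])  # outrage
```

Indexing by id makes it straightforward to join the model's coding back to the original comment, as in the "Coding Result" table above.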