Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@KsazDFW I actually looked up the findings, the report itself after the initial investigation. I don't remember the wording now, but what it actually seemed to be saying was that the car stopped controlling in some cases near impact. This would make sense if the software realizes that it has no good choices and would thus REQUIRE the human driver to take over. What it does NOT imply is that Tesla is trying to avoid responsibility, since there is obviously a data record that would not absolve them of the car driving recklessly until one second or less before a collision. There are naturally going to be accidents, and some of them could be avoided if the AI could be made "perfect." That is of course impossible, but that doesn't change the fact that there are over 5 million miles of "autopilot" driving, on average, between accidents, whereas for all drivers the average mileage between accidents is less than 500k. Tesla's AI is safer, period. And always improving.
Source: YouTube · AI Harm Incident · 2022-09-27T13:3…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_Ugx0v9_oun-TqDb_0mZ4AaABAg.9figAgQOfkz9gUHJkG63DI","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgySIcK9ZqEB5BDABx14AaABAg.9fgvU2SihCe9gJt8KrubWI","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgxEfJtmETgub6C6jAR4AaABAg.9fgXdjFqUDY9fhUNDRrbI2","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzRiNdrKF-yX7lI5Nd4AaABAg.9fgUKTslEAT9fhkO-hK8Gm","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytr_UgytqgQTmw0Ybxc1fIJ4AaABAg.9fgFC97_nDn9foSsCbOSQz","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxfUYyk4gnRyHadMnd4AaABAg.9fgBu2LM56c9fguwmqvc6S","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgxfUYyk4gnRyHadMnd4AaABAg.9fgBu2LM56c9fjGPDzSbSi","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxfUYyk4gnRyHadMnd4AaABAg.9fgBu2LM56c9flmnUOnnkn","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxfUYyk4gnRyHadMnd4AaABAg.9fgBu2LM56c9flnA4EJuQX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxEf-7CTmm69HqPRBJ4AaABAg.9ffSv_5EmjC9ffbPfiLBzS","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
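A response like the one above can be parsed and checked before the codes are stored. The sketch below is a minimal illustration, not part of the coding pipeline itself: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response, while the allowed value sets are assumed from the values observed in this batch — a real codebook may permit more categories. The raw string is truncated to two records for brevity.

```python
import json

# First two records from the raw LLM response above (truncated for brevity).
raw = '''[
  {"id":"ytr_Ugx0v9_oun-TqDb_0mZ4AaABAg.9figAgQOfkz9gUHJkG63DI","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgySIcK9ZqEB5BDABx14AaABAg.9fgvU2SihCe9gJt8KrubWI","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

# Assumed value sets, inferred from the values seen in this batch only.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "industry_self", "regulate", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "resignation", "approval", "unclear"},
}

def validate(records):
    """Return a list of (id, dimension, value) triples that fall outside ALLOWED."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

records = json.loads(raw)
print(len(records), "records,", len(validate(records)), "errors")
```

Validating before storage catches the common failure modes of LLM coders — malformed JSON, invented categories, or missing dimensions — so a bad batch can be re-prompted instead of silently written to the database.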