Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The fundamental problem here is that Autopilot is a single entity. All mistakes in one car with Autopilot are present in ALL cars (with the same version of software). If we put humans under the same scrutiny, how many of us would be perfect under any conditions? When a human makes a mistake and kills somebody on the road, it is an "accident". We don't even attempt to fix all humans to not do that mistake again. Now, when we have this single entity, the Autopilot system, under such scrutiny, we will be able to fix it beyond any living human's capabilities with time and then at least rely on it to make predictable mistakes. That is the real benefit of automated driving; make it more predictable. In the current mix, we need to realize that accidents happen and will happen, no matter how smart we are trying to be and no matter how much lawsuits are thrown against car makers. It is the engineers that will fix the problems, not lawyers.
youtube AI Harm Incident 2022-09-16T10:4…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxHrCI0uHIqia0hz-J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyoJfSZ3NurWPRytpJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzqCU3ivdbiCdo8MJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzPLneU_YBeBgW6gTl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxkM5h3_URiw7lZ2nt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzsYd5MwpJOVPG0NYR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxyN_WHQ7Xe-a7UFhB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
  {"id":"ytc_UgxdS4gDGscj05gH2id4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0B_TNxRfQkZzUon14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxxkB_ABeZtqWqGF-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
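The per-comment coding result above is recovered from this raw response by parsing the JSON array and looking up the comment's id. A minimal sketch of that lookup (the ids and label values are copied from the response shown; the variable names are illustrative, not the pipeline's actual code):

```python
import json

# Two records excerpted verbatim from the raw LLM response above;
# the second id is the comment whose coding result is displayed.
raw = '''[
  {"id":"ytc_UgxHrCI0uHIqia0hz-J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzqCU3ivdbiCdo8MJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

records = json.loads(raw)

# Index the batch by comment id so any single coding can be looked up.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgzqCU3ivdbiCdo8MJ54AaABAg"]
print(coding["responsibility"])  # distributed
print(coding["emotion"])         # resignation
```

A lookup like this also makes it easy to spot mismatches, e.g. an id in the batch with no corresponding comment, or a comment the model skipped.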