Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I see two possible reasons why a self-driving car would have a pre-programmed decision for such a scenario: 1) A programmer anticipated the car being in just that situation and programmed in a rule to take care of the outcome rather than coming up with a rule to avoid getting into that situation in the first place - like "don't tailgate when you don't have room to swerve". 2) No programmer anticipated that exact scenario, but there are sufficiently broad rules for scenarios that are close enough that they can be applied in order to get a decision - which may or may not be a good one.

In the first case, that programmer is responsible for the outcome of the situation (though anyone who imposed additional constraints on his work that prevented a solution that avoided the scenario in the first place bears their own share of responsibility). In the second case, barring negligence, the programmer is not responsible for the outcome.

In scenarios where a self-driving car is actually boxed in, a large part of the problem is the non-self-driving vehicles doing the boxing - it's very easy to design self-driving cars with "flocking" behaviour that would allow them to avoid a collision provided there's a way to avoid the collision if enough vehicles coordinated their movements - the reaction time on these things is short enough, and the individual decisions simple enough, that each vehicle can act autonomously and the net effect be as though they were co-ordinated by a single processor rather than simply communicating by their motion. So swerve toward the SUV, which will swerve or accelerate away from you (meanwhile, the vehicles behind brake sharply)...
youtube · AI Harm Incident · 2016-01-16T03:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           liability
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
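
Each coded record maps the four dimensions to a label from a closed set. Below is a minimal Python sketch of that schema; the label sets are inferred only from the raw response shown further down, so the project's full codebook may allow values that do not appear in this batch, and the class and function names are illustrative rather than part of any pipeline code shown on this page.

    from dataclasses import dataclass

    # Label sets inferred from the raw LLM response below; the full
    # codebook may define values not observed in this batch.
    RESPONSIBILITY = {"developer", "user", "ai_itself", "distributed", "none"}
    REASONING = {"consequentialist", "contractualist", "mixed"}
    POLICY = {"liability", "regulate", "industry_self", "none"}
    EMOTION = {"mixed", "outrage", "approval", "indifference", "resignation"}

    @dataclass
    class CodedComment:
        id: str
        responsibility: str
        reasoning: str
        policy: str
        emotion: str

        def validate(self) -> None:
            """Raise ValueError if any dimension falls outside its label set."""
            for field, allowed in (("responsibility", RESPONSIBILITY),
                                   ("reasoning", REASONING),
                                   ("policy", POLICY),
                                   ("emotion", EMOTION)):
                value = getattr(self, field)
                if value not in allowed:
                    raise ValueError(f"unexpected {field}: {value!r}")
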
Raw LLM Response
[ {"id":"ytc_Ugg6IX-uG5XQOngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Uggqx26B0vYlNngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgjwZCpf6uJ5EngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgjQFdEz8fzO-ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}, {"id":"ytc_UggF86o_OEFCZHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugh-bk-TAV7aFXgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgileDub0CwddngCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"approval"}, {"id":"ytc_UgjpqrVAg7rgYngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UghQCXhv7515e3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgisOSWSkQ0bTXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]