Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the central flaw of thought experiments like this is we assume that a self driving car can only drive and react like a person can. Our current traffic is made up of millions of individuals that usually drive faster than they can realistically keep track of their surroundings. And they have almost no communication with each other. If we connect our self driving machines as part of a larger whole and equip them with the sensors and processing power to react properly then most of the issues we currently have would disappear. Instead of thinking about how a self driving car decides who dies, we need to look at ways to make them able to avoid having to make these decisions at all. For example, program the car in our experiment to keep a proper stopping distance behind heavy vehicles with open loads. The accident in question could be avoided by simply applying the brakes on the cars immediately behind, and merging cars further back into other lanes. Have the central traffic system flag the accident site once the boxes stop moving and send clean up vehicles.
YouTube · AI Harm Incident 2021-07-31T08:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           industry_self
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugzslwt9QiheJpWRbfp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugxcu2cWNWH8_geonwJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwXxgUBr-1VnhEAZzd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwugIWuRy3J137G9j14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugx2e0zPvlz0YqnvIjZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxvMPLnc85qWm1CCV14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxcvVkDSDnxfMmBnS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwSIBvE8zK_FrylW6d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxTbdc_xoYz5h6zmE54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwIxDwsTLHdFBsCgnx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
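One way to inspect a raw response like the one above programmatically is to parse the JSON array and index the per-comment codings by their `id`. This is a minimal sketch, not part of the original tooling; the two records shown are an excerpt from the response above.

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codings.
raw = """[
  {"id":"ytc_Ugzslwt9QiheJpWRbfp4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugxcu2cWNWH8_geonwJ4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]"""

# Index the codings by comment id for O(1) lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment shown above by its id.
record = codings["ytc_Ugxcu2cWNWH8_geonwJ4AaABAg"]
print(record["responsibility"], record["policy"], record["emotion"])
```

This makes it easy to cross-check the per-dimension values in the Coding Result table against the exact record the model emitted for that comment.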