Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why not choose the option that harms only your own vehicle, even if it means a severe accident? Self-driving cars are, as the video says, also meant to reduce traffic accidents, so everyone who buys one can be told that, statistically, it lowers the driver's probability of having an accident, as long as they are also told that the car will sacrifice itself in those rare situations so as to harm no one else. From a legal perspective, the (hopefully not dead) person then knew the "gamble" and cannot sue the companies, because the owner agreed to accept the risk of an accident, while safety standards for non-self-driving cars remain separate from this new technology. It also prevents further accidents from collateral damage and thus avoids ethical dilemmas like those in the video. P.S. There are far too many situations to handle; this is just my thought on the particular problems stated in the video. Have a nice day :)
Source: youtube · AI Harm Incident · 2021-08-10T09:0… · ♥ 1
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugzslwt9QiheJpWRbfp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugxcu2cWNWH8_geonwJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwXxgUBr-1VnhEAZzd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwugIWuRy3J137G9j14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugx2e0zPvlz0YqnvIjZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxvMPLnc85qWm1CCV14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxcvVkDSDnxfMmBnS94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwSIBvE8zK_FrylW6d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxTbdc_xoYz5h6zmE54AaABAg", "responsibility": "distributed", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwIxDwsTLHdFBsCgnx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
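A minimal sketch of how such a raw response can be inspected, assuming (as the output above suggests) that the model returns a JSON array with one object per coded comment, each carrying an `id` plus the four coding dimensions. The two example objects are copied from the response above; the lookup helper is an illustration, not part of the tool:

```python
import json

# Raw LLM response: a JSON array, one object per coded comment.
# Two entries copied verbatim from the response shown above.
raw = '''[
  {"id": "ytc_Ugzslwt9QiheJpWRbfp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugx2e0zPvlz0YqnvIjZ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

codes = json.loads(raw)

# Index by comment id so one comment's coding can be looked up directly.
by_id = {c["id"]: c for c in codes}

# Pull out the coding for the comment displayed on this page.
row = by_id["ytc_Ugzslwt9QiheJpWRbfp4AaABAg"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {row[dim]}")
```

Looking the comment up by `id` rather than by list position keeps the inspection robust when the model reorders or drops comments in its batch response.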