Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To be honest I don't think I could avoid the collision even if I am focused on driving when the person suddenly shows up on the dark road like that at that speed. If I have to choose whom to blame, I would choose the technology failure. It is the easiest thing to blame and can force engineers to improve the outcome for similar situations in the future. I think the algorithm for such situation is to anticipate the potential obstruction on the lane where the car is more inclusively (any object that moves on the lane and on two lanes of each side of the car will be included). One theory I have is the LIDAR worked and saw the woman but the speed calculation / estimation for her was not done correctly or in time for the computer to decide to react differently. This is a tough area for computers: translates and interprets the moving objects into effective decision making factors. Yes it sees the woman, but for some reason it does not think she is going to be occupying the same place as the car at the same time.
Source: youtube · AI Harm Incident · 2018-03-24T16:1… · likes: 2
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwRetTsi4i0BqNRF114AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugy9Q4IOXYexIL_Uknd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgxcAqQObdXaeSzh81B4AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgwX_g2oZkBEcBYpK1x4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_Ugwo3xZi5Qa15kzDWnR4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_Ugy1QL1_yfOFfFRBVvh4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyTduAF4Rg9I0AUbUx4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgzsDN3E4w5XR8_3azJ4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_Ugwm7oRC8jx_I495dNx4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_Ugwr0t6NT-Q7TyAiMbd4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"}
]
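Since the raw response is a JSON array of per-comment codes, a minimal Python sketch (using two entries copied from the output above; variable names are illustrative, not from the tool) shows how to index the batch by comment id to inspect a single comment's coding:

```python
import json

# Raw LLM response, truncated to two entries for brevity
raw = """[
  {"id": "ytc_Ugwm7oRC8jx_I495dNx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwRetTsi4i0BqNRF114AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

# Build an id -> coding lookup so any coded comment can be inspected directly
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Pull the coding for the comment shown on this page
coding = codes["ytc_Ugwm7oRC8jx_I495dNx4AaABAg"]
print(coding["responsibility"])  # -> ai_itself
print(coding["emotion"])         # -> resignation
```

The id-keyed dictionary is the useful step: the model returns codes for a whole batch, so matching on `id` rather than array position keeps the inspection robust if the model reorders or drops entries.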