Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
First of all, i dont trust self driving cars but im sorry to say that it isnt tech's fault. I think if it wasnt auto pilot but driver, he/she definitely hit her/him. There's like 1.5-2 secs, you should realize the situation first (0.5-1 sec), you move your leg to the brake (0.5-1 sec) and you brake (how many secs left for this? Maybe 0.5 sec), otherwise you should drive around to left side, if you can realize the situation. You should be really pro driver to do that. The other thing is that, its evening/night time & pedestrian should see the headlight but he/she didnt even check the road. I dont understand that. Of course driver had to pay attention but pedestrian also had to pay attention when he/she is crossin the road. Also doesnt wearing any reflective clothes. I think even the driver drove that car, crash certainly can happen. Only fault of this tech in this situations is that, not braking. I dont know why, laser should detect the pedestrian no matter what. Volvo made a comment and they said that the car wasnt working on volvo's software but uber's software. Maybe they should check the uber's software cause of no braking.
youtube AI Harm Incident 2018-03-22T12:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxzxBdh4ic0G0rVpBt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzH4QZI5SK5ZDudo5N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgzeR2MAoXpzAeoJuSt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxaPEnnVPVJKXCaAH94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzENXYaM8mBbW0okPB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzTx0_sm-U6lOTFXgZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzEyiLflYKd8fQR_zh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyclY637vkvrbuvSPF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwCv5H0mcTPDuELTat4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyGFiRB7ETNFOrLDy54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
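A response like the one above can be parsed and sanity-checked before its rows are attributed to individual comments. The sketch below is a minimal example, not part of any pipeline shown here; the allowed value sets are inferred only from the labels that appear in this one response, so the real codebook may permit more categories.

```python
import json

# Raw model output, truncated to two entries for brevity
# (the full response above contains ten).
raw = '''[
  {"id": "ytc_UgzTx0_sm-U6lOTFXgZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxzxBdh4ic0G0rVpBt4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]'''

# Value sets observed in this response; assumed, not authoritative.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"outrage", "resignation", "fear", "indifference"},
}

def index_codings(raw_json: str) -> dict:
    """Map comment id -> coding dict, rejecting unexpected label values."""
    out = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        out[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return out

codings = index_codings(raw)
print(codings["ytc_UgzTx0_sm-U6lOTFXgZ4AaABAg"]["emotion"])  # resignation
```

Indexing by comment id is what lets a page like this one look up the single row (here, the "resignation" coding) that belongs to the displayed comment.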