Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You would think, if, as you relate, Harmin, the car didn't stop because it was unsure of what it was seeing, it would be a built in feature to automatically stop in such situations or at least alert the driver of a possible impending hazard. I mean please, err on the side of caution!!! I'm sure the car was not intentionally programmed to plow right through objects it was unsure of... right?! So, Uber is at least partially responsible; they failed to foresee and test for this scenario. In fact, that would be the basis for my defense against these charges. Sad Elaine was struck and killed, but she was both jaywalking outside the crossing zone and chose to step out in front of a moving vehicle at night, she is as much responsible for her death as anyone or thing. Who taught her to cross streets anyway? Let's put them on trial too! 'She had already crossed 2 lanes when struck,' meaning this was a major roadway! Look both ways before you cross!!! Honestly, I feel sorry for the car's attendant. No, maybe she wasn't paying close enough attention, but she had experience operating the car in that mode and perhaps developed an over confidence in its abilities to do what it was programmed to do: arrive at its destination without killing people or hitting stuff. And, if that misapprehension is true of her, it would be true for many drivers, so in a sense, Elaine's death was not in vain. Now they know autocars need tweaking to protect against this situation reoccurring. It is not Rafaela's fault the car didn't do as it should have done... STOP!!! The driver's mistake was trusting in the new and burgeoning technology more than she should have. That she was charged at all is a travesty. Imagine how she feels having taken a life. This story is sad all around. I'd be curious to know how many hours experience Rafaela had driving in that mode? It could mitigate her culpability significantly if it is a substantial amount of time. 
In the end, a rational thinker must conclude, would Rafaela look away from the roadway over 100 times if she had had even one close call? I think not. After all, consider this, she knew she was on camera! Someone died. The technology failed. It needs work.
youtube AI Harm Incident 2021-11-18T16:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          mixed
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "ytc_UgzkPEU0V78YV4aXvrd4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgzIDyjW2A9zsyfs9qJ4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UgwYlAx5OsGiwZsnhfZ4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_Ugw3nQuhz902y7u7hTF4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgwP61B15cqi5auuFrV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
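To turn a raw response like the one above back into a per-comment coding, one parses the JSON array and looks up the entry by comment id. A minimal sketch of that step follows; the function name `coding_for` and the malformed-JSON fallback are illustrative assumptions, not part of the tool itself:

```python
import json

# Abbreviated raw model output in the same shape as shown above:
# a JSON array of per-comment coding objects.
raw_response = '''[
  {"id": "ytc_UgwP61B15cqi5auuFrV4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "mixed"}
]'''

def coding_for(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent
    or if the model emitted malformed JSON."""
    try:
        entries = json.loads(raw)
    except json.JSONDecodeError:
        return None  # caller can log the bad response and re-prompt
    return next((e for e in entries if e.get("id") == comment_id), None)

print(coding_for(raw_response, "ytc_UgwP61B15cqi5auuFrV4AaABAg"))
```

Returning `None` on a decode error (rather than raising) keeps a batch-coding loop alive when the model occasionally produces invalid JSON.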