Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The woman never saw the car coming because it was driving in "Autonomous Mode" with the headlights *OFF*. How then to explain what you see in the video? Here's a neat trick. Take your TV/Stereo/whatever remote, look directly at the LED in the end of it and press any button. What do you see? Nothing, because the human eye can't detect the infrared beam. Now, take any digital camera, (most cell phones will work except iPhones - I don't know why). Look at the led through the view window on the camera and press the button again. What do you see? The digital camera "sees" the infrared and interprets it as a white light to your eye, *just like the digital camera that took the video of the accident*. What you see in the video is the infrared scan that the CAR SAW, not the driver, nor the woman crossing the street because the headlights were turned off. Notice there are no "Hot Spots" in the scan such as you would get from two separate headlight beams, just a "stripe" in front of the road.
youtube · AI Harm Incident · 2018-05-24T06:3… · ♥ 1
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzZSnnfj59UOcWUNUd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzMUqIxyPPq82ZPwH14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgznW18G3AMIE_uCDFB4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwtM4pOivcujKaP98B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxXlsYlR39y7k7YXLl4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz0LPFwCBLHB8OSH4d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxKr-UyshtI6A1hDOd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxKQZ3qrwjF4MfFJGl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzBsJ6NOsaDQOsdEPx4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugw6xrdTx1uwcPo-yZ14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]
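As a minimal sketch of how a response like the one above could be parsed and checked, the snippet below loads the JSON array, verifies every record carries an `id` plus the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion), and tallies the values per dimension. The field names come from the JSON itself; the validation and tallying logic is an assumption about how one might post-process it, not the pipeline's actual code, and the array is truncated to two entries for brevity.

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of coded comments
# (truncated to two entries here; the full array appears above).
raw = '''[
  {"id": "ytc_UgzZSnnfj59UOcWUNUd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzMUqIxyPPq82ZPwH14AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

records = json.loads(raw)

# Check that each record has an id and all four coding dimensions.
for rec in records:
    missing = [key for key in ("id", *DIMENSIONS) if key not in rec]
    assert not missing, f"record {rec.get('id')} is missing {missing}"

# Tally each dimension's values across the batch.
tallies = {dim: Counter(rec[dim] for rec in records) for dim in DIMENSIONS}
print(tallies["responsibility"])
```

Running this over the full ten-record array would give batch-level counts per dimension, which is typically the next step after per-comment coding.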