Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It was not at all the pedestrian's fault. THE CAR'S HEADLIGHTS WERE SET ABSURDLY LOW. Essentially the monitor was driving blind and could not see more than 40 feet or so in front of the car! The normal range of headlight settings does not allow the lights to be set at such a low angle. The only way that can happen is if, BY DEFECTIVE DESIGN OR MANUFACTURING FLAW, the adjusting screws just come completely out of the thread. There should be a stop on the end of the screw to prevent that. Apparently whoever made the last headlight adjustment, probably as part of his/her normal procedure, ran the screw to the limit of adjustment and then adjusted up until they hit the target. But since the stops were not there, it ran off the end, and there was nothing that could be done to correct the problem without disassembling the headlights. And, probably thinking it was a flaw only on that headlight, he/she did the same on the other side. He/she had to know what happened, BUT THE CAR WAS RELEASED FOR USE ANYWAY. Some supervisor probably thought that since the car could "see in the dark" it would be OK to drive that way until there was time to do the repair. (Maybe the monitor himself had driven it in for the maintenance and the technician didn't even know how to disassemble the headlights.) IN EFFECT THAT MEANT THAT THE CAR WAS BEING DRIVEN WITHOUT A MONITOR. Whoever made the decision to let the car be driven at night with the HEADLIGHTS TOTALLY USELESS (because they could not detect anything until it was so close that there wasn't even time for the monitor to hit the brakes) should be held fully responsible for the accident. THE PEDESTRIAN WAS MISLED BY THE FAULTY HEADLIGHTS INTO THINKING THE CAR WAS A LOT FARTHER AWAY THAN IT REALLY WAS. The pedestrian was not remotely at fault here. Obviously either the AI and/or the LIDAR was also faulty, because the LIDAR does not use the light from headlights to work.
The person making the decision to let the car be driven at night knowing full well that the monitor could not see, in effect OKed the driving of the car without a monitor and should be tried for criminal negligence.
youtube AI Harm Incident 2018-03-25T12:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwQSXbU7Q85q8b76mZ4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_Ugy8EZtgCj1XaAKd_s14AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxHMbTy_aU6JBgBrn14AaABAg", "responsibility": "company",     "reasoning": "contractualist",   "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgyYubXqVIHSqBEzcAF4AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_Ugz8tm6hAbM_7A28spF4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwGgDThB8optVdQIWJ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugyh6nok0wtpZFnkLD54AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgxxROSKaA2ZUVm1sXJ4AaABAg", "responsibility": "company",     "reasoning": "virtue",           "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgyI0Cmp8JRJTnOvgsB4AaABAg", "responsibility": "ai_itself",   "reasoning": "virtue",           "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgyBfTen42fojrSasHl4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "unclear"}
]
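Before a raw response like the one above is recorded as a coding result, it can be parsed and checked against the codebook. Here is a minimal Python sketch; the allowed values per dimension are inferred from the records shown here and may not be the full codebook, and `validate_codings` is a hypothetical helper, not part of any tool named in this record:

```python
import json

# Assumed codebook, inferred from the values visible in these records.
CODEBOOK = {
    "responsibility": {"none", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban", "unclear"},
    "emotion": {"resignation", "outrage", "fear", "mixed", "approval", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    items = json.loads(raw)
    valid = []
    for item in items:
        # Comment IDs in these records all carry a "ytc_" prefix.
        if not item.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present and hold an allowed value.
        if all(item.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(item)
    return valid

raw = ('[{"id":"ytc_Ugy8EZtgCj1XaAKd_s14AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # → 1
```

A check like this catches the usual failure modes of structured LLM output, such as a misspelled label or a dropped field, before the coding is stored.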