Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The lady might be at fault according to traffic law, but she wouldn't have been hit if the car had been driven by an experienced driver, because human eyes are more sensitive to objects in shadowy areas. You can test it by taking a photo in a place like the one in the video. In the photo, the shadowy area shows completely dark, just like in the video. But if you use your own eyes, you will definitely see the silhouette of a person. Besides, an experienced driver will slow down and proceed with caution if he can't clearly see the situation ahead within the braking distance. I don't know who wrote the code of the self-driving software, but its logic seems to be: if the camera doesn't show anything, there isn't anything, so go ahead at full speed... But if a human driver were driving the car, his logic would be: if I'm not sure there isn't anything ahead, slow down and proceed with caution...
Source: youtube · AI Harm Incident · 2018-03-24T02:5… · ♥ 4
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "ytr_Ugx9Nu0VBZfVU2rahLl4AaABAg.8e7Lllqtcq08e7t3_C7dRO", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugx9Nu0VBZfVU2rahLl4AaABAg.8e7Lllqtcq08e9TEu4DbZV", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugx9Nu0VBZfVU2rahLl4AaABAg.8e7Lllqtcq08e9eRLcQCAF", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzeqRnh0XLiG23w8cl4AaABAg.8eOdBGEi2d695FC7nU4tE-", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzeqRnh0XLiG23w8cl4AaABAg.8eOdBGEi2d69jhVqpUdfNS", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
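To inspect the exact model output for one coded comment, the raw response can be parsed as a JSON array and filtered by comment id. The sketch below is a minimal illustration, not part of the coding tool itself; it reuses two records from the response above (the helper name `coding_for` is an assumption).

```python
import json

# Raw LLM output: a JSON array of coding records, one per comment id.
# Two of the five records from the response above, reproduced verbatim.
raw = """[
  {"id": "ytr_Ugx9Nu0VBZfVU2rahLl4AaABAg.8e7Lllqtcq08e9eRLcQCAF",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzeqRnh0XLiG23w8cl4AaABAg.8eOdBGEi2d695FC7nU4tE-",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]"""

def coding_for(records, comment_id):
    """Return the coding record for one comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)
hit = coding_for(records, "ytr_Ugx9Nu0VBZfVU2rahLl4AaABAg.8e7Lllqtcq08e9eRLcQCAF")
# 'hit' carries the same values shown in the Coding Result table above.
```

Looking up by id rather than by position makes the check robust if the model returns records in a different order than the comments were submitted.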