Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is clearly an autonomous-mode failure. The car had no business driving over the speed limit at night (algorithm fault). Both the LIDAR and radar should have easily picked up such a target, even without any lights at all (slow-moving, large cross-section). Even the camera should have provided enough data for the car to TRY to react (it can react almost instantly). To say that we couldn't possibly have expected more from the car in this situation is completely ridiculous. The officer who made the comment is no technology expert. He was shown footage from a camera with frankly terrible dynamic range, and he concluded that if he had been driving while looking at the camera feed he would probably have done no better. The point is that this camera does not have better night vision than a human eye (it seems a lot worse) and that the car shouldn't have been driving relying on that camera feed. For a multi-sensor-equipped robot, this should have been one of the simplest dangers to avoid. It's the car's fault, unless the data from the other sensors show that the woman was waiting in the shadow in the opposite lane, that the car concluded she would continue to wait for it to pass, and that she then moved in front of the car at the last second. I don't see any other explanation apart from multiple sensor failures.
youtube 2018-03-22T09:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwy27zHitEZBoXDeJ54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx98EBWYallMqPPMvh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxiKooY4rKZoh0SSRt4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx5G_XUW7ns7-yMxrR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxTD9zju23ZKhg72Ax4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzj9kIO3ZzulRYXqOd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwj5yEuavPRmLqR-4R4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzdJSUrC_0JQqSHhj94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxlsnAQvcFMG9H9p5F4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
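To inspect the raw model output for any one coded comment, the JSON array can be parsed and filtered by comment id. The sketch below is a hypothetical helper (the function name `coding_for` is not part of any tool shown here), using two entries copied from the raw response above; it assumes the response is valid JSON with an `id` field per entry, as it is in this batch.

```python
import json

# Two entries copied from the raw LLM response above (the full array has ten).
raw = """[
  {"id": "ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzdJSUrC_0JQqSHhj94AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]"""

def coding_for(comment_id, response_text):
    """Return the coding dict for one comment id, or None if it is absent."""
    for entry in json.loads(response_text):
        if entry["id"] == comment_id:
            # Drop the id itself so only the coded dimensions remain.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

print(coding_for("ytc_UgxX5YFJI2vgy_Zyhyt4AaABAg", raw))
# → {'responsibility': 'ai_itself', 'reasoning': 'consequentialist',
#    'policy': 'liability', 'emotion': 'outrage'}
```

If an id is not present in the batch, the helper returns `None` rather than raising, which makes it safe to run over a list of ids that may span several batches.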