Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This video is wrong on so many levels. First of all, the scenario with the truck would never happen, because self-driving cars would always follow at a distance that they can safely stop. Second, even if that scenario would occur, the car is not factoring whether to hit a car or motorcycle, it's not even able to determine that there is a car and a motorcycle, only that there are two objects on the left and right. If an object were to fall from the sky and this scenario to occur, the self driving car would always stop. There are no self driving cars that try to swerve to miss accidents, this creates unpredictable scenarios like swerving off a cliff or hitting other cars. I'm not even going to touch on the craziness of a car trying to decide whether or not to hit a motorcycle with or without a helmet. How does it know the driver is wearing a helmet? Are we 100 years in the future and these cars are fully cognizant.
youtube AI Harm Incident 2017-06-28T12:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UggQprZepafZ1HgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UghRftAajpYgC3gCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgiVNu11IS5PiHgCoAEC", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugis3gL-vgXrpHgCoAEC", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugj3h2tFVAqPSngCoAEC", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_Ugj5A0pJm2zcoXgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UggIGhHRenxDK3gCoAEC", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgjS60trIUKAvHgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UggCEcSJA552hHgCoAEC", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgitMxhB_OZFhXgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"}
]
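A raw response like the one above can be validated before it is written to the coding table. The sketch below is a minimal example: the four dimensions and the value sets are taken from the response shown here, but the full codebook may define additional categories, so `ALLOWED` is an assumption rather than the project's actual schema.

```python
import json
from collections import Counter

# Value sets observed in the raw response above; the real codebook
# may allow more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse the raw LLM output and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# One row from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgiVNu11IS5PiHgCoAEC","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
rows = parse_llm_response(raw)
summary = Counter(r["emotion"] for r in rows)
```

Rejecting unknown codes at parse time keeps a single malformed LLM answer from silently polluting the downstream counts.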