Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I fault the driver, who was obviously lackadaisical in "doing their job." Any Uber driver, bottom line, must still be responsible for "driving" the vehicle, and when push comes to shove, they must be ready. I also fault Uber for giving this or any other "driver" a false sense of security with their (as yet proven safe) new technology that they don't mandate a driver to still behave just like a regular driver. This is still in the testing phase, mind you. In fact, this driver exerted so little effort while on the job that it appears they were texting or watching something on their cell phone. They are not working, and therefore, they should be fired. Disgraceful.

~~~~~> Sam Abuelsmaid, an analyst for Navigant Research who also follows autonomous vehicles, said laser and radar systems can see in the dark much better than humans or cameras and that Herzberg was well within the range. "It absolutely should have been able to pick her up," he said. "From what I see in the video it sure looks like the car is at fault, not the pedestrian." Smith said that from what he observed in the video, the Uber driver appears to be relying too much on the self-driving system by not looking up at the road. "The safety driver is clearly relying on the fact that the car is driving itself. It's the old adage that if everyone is responsible no one is responsible," Smith said. "This is everything gone wrong that these systems if responsibly implemented, are supposed to prevent."~

http://www.foxnews.com/auto/2018/03/22/experts-say-self-driving-uber-shouldve-spotted-pedestrian-in-deadly-crash.html
youtube AI Harm Incident 2018-03-23T00:4… ♥ 1
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        mixed
Policy           liability
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw8Elb47QXaMnsIVHp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxpWi3R9zAp4WeoPAh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxdp1aICHaICCLXvlV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugzd9D1T2X3JpSTfBVx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeNF5xAtoIKdUIuVV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwENzbrg1avqzgVN3J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw13AXRy9I6snSeQXt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyyFxpTrUO1rio8K4Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwRr4JA8bOXyO2ombB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyhCoRhkNrPE87FgSZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
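The raw response is a JSON array of per-comment codes with four dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and tallied per dimension follows; the three entries in the string are copied verbatim from the response above, and the dimension names are taken directly from the JSON keys. This is an illustration only, not part of the coding pipeline itself.

```python
import json
from collections import Counter

# A subset of three entries, copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_Ugw8Elb47QXaMnsIVHp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxpWi3R9zAp4WeoPAh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxdp1aICHaICCLXvlV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"}
]'''

codes = json.loads(raw)

# Count how often each value appears within each coding dimension.
tally = {dim: Counter(c[dim] for c in codes)
         for dim in ("responsibility", "reasoning", "policy", "emotion")}

for dim, counts in tally.items():
    print(dim, dict(counts))
```

On this subset, for example, `tally["emotion"]["outrage"]` is 2 and `tally["policy"]["liability"]` is 2; the same loop over all ten entries would yield the distribution-level summary shown in the Coding Result above.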