Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
the liability lies withing the person/human with the stupidest decision, if someone is stupid enough to RUN in front of a moving vehicle then they are responsible. automation is the responsibility of the purchaser and the manufacturer. (simply put, a child chokes on a small toy piece and the company is sued for negligence and the parent isn't ignorant of their childs activity, apparently) as a side note, autonomy should only be in use on long drives, city areas NEED to be manual, no question, maybe after midnight, cars can become taxis for the drunk, but then again a humans should drive the car to control the alcohol induced individuals, ensuring safe return and responsibility. country roads, highways, motorways, all those long roads should be automated for the lack of human pedestrians, if the population of humans exceeds dangerous obstacles then humans should drive, only someone lazy enough would expect a car to drive them less than 30 minutes to a destination. sure it seems nice to have your car drive you home, but that's just as dangerous in a city, drunks fall in the road, forget to look for cars, ignore loud noises. city automation should be allowed ONLY for the blind, and the car should travel slower than the speed limit. automation is a good idea, but lazy people already scam wheelchairs and other aids, imagine how lazy people can be if they don't feel responsible enough in a giant metal coffin to pay attention and keep control.
youtube · AI Harm Incident · 2014-05-25T16:2…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
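
For readers who want to handle coding results programmatically, here is a minimal sketch of how a record like the one above could be modelled. The class and field names are illustrative assumptions mirroring the dimensions shown in the table, not the pipeline's actual code.

    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical model of one coding result; field names mirror the
    # dimensions shown above, not any real pipeline class.
    @dataclass
    class CodingResult:
        responsibility: str   # e.g. "user", "company", "none"
        reasoning: str        # e.g. "deontological", "consequentialist", "mixed"
        policy: str           # e.g. "liability", "regulate", "none"
        emotion: str          # e.g. "outrage", "fear", "indifference", "approval"
        coded_at: datetime

    # The record displayed above, expressed with this sketch.
    result = CodingResult(
        responsibility="user",
        reasoning="deontological",
        policy="liability",
        emotion="outrage",
        coded_at=datetime.fromisoformat("2026-04-27T06:26:44.938723"),
    )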
Raw LLM Response
[ {"id":"ytc_UgjmT9M6pRPF63gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugjb5mbCYWFOZngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugh3t6ctXcIqLngCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"}, {"id":"ytc_UgjhRXM2999pMXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjWeooRvjbb43gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgiQxrJhhXFfdHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}, {"id":"ytc_UgjQXLSSJdVGungCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UggumOOE0wMPhXgCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgjQ6QxetkdMEHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugg5W5f566W6tngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"} ]