Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I used to teach bicycle safety and commented to my students that they had to be triply observant: 1) For pedestrians, 2) For themselves, 3) For cars. Human drivers often will see pedestrians but miss cyclists (just a few years ago, I was nearly hit while turning left by a speeder who didn't see me, the bike I was on with multiple flashing lights and reflectors); we're pretty small objects on the roads (and sadly, not all roads have defined bike lanes). A problem I often have to so-called self-driving cars is that they respect those bike lanes (which typically only have a thin stripe separating them from the main road) even less than human drivers do. It's only a matter of time before some Tesla driver decides to use the "auto-pilot" in areas with bicycles and kills one. Frankly, I can't say I'm a big fan of Teslas or self-driving cars right now, and Musk's cynical and callous defense of his vehicles and company only demonstrates his lack of compassion if not lack of intelligence. (a truly intelligent person would have at least simulated compassion for those his vehicles have hit and killed, and Musk never has any)
youtube AI Harm Incident 2022-09-29T19:2… ♥ 302
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyD01XuCc3TMxZvTrl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwEE3H8wEeY7SJ6C654AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzAa2OAHEIeT6tBx_N4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwmXleybRBST2NUUDx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgztsG6Eu386q_0W8qp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyNaqX4kEjDzeQAomB4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzvRSZ5UMTp5W4I5qR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzf5TmnZOvSJgMtxGh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzp0vvrjwNT-jYIzHJ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugyd94zeSEsiHzQrWaJ4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "outrage"}
]
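A minimal sketch of how a raw batch response like the one above can be parsed and matched back to a single comment id. The `coding_for` helper is illustrative (not part of any logged tool); the ids and dimension values are taken from the response shown here, with the batch truncated to two entries for brevity.

```python
import json

# Excerpt of the raw batch response logged above (two of the ten entries)
raw = '''[
  {"id": "ytc_UgzvRSZ5UMTp5W4I5qR4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyD01XuCc3TMxZvTrl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Parse a batch JSON response and return the coding for one comment id."""
    codings = json.loads(raw_response)
    by_id = {c["id"]: c for c in codings}
    return by_id[comment_id]

# Look up the coding for the comment shown in this record
coding = coding_for(raw, "ytc_UgzvRSZ5UMTp5W4I5qR4AaABAg")
print(coding["responsibility"], coding["emotion"])  # user indifference
```

Indexing by `id` this way makes it easy to spot comments the model skipped or duplicated in a batch, which is the usual reason to inspect the raw output rather than the flattened coding table.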