Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Do you know how unlikely it is to be in a situation wear you are boxed in and a truck in front of you drops objects? Most of the time people self consciously avoid trucks that look to have unstable packages, which is where a motorist would intervene, destroying your entire argument, not to mention, like I side before the odds of that happening are astronomically small. You are honestly pulling what ifs out of your ass if you consider this a valid argument for autonomous vehicles. If by some chance it does happen, the computer wouldn't be able to react because they are programed to not hit any cars, and likely wouldn't see the boxes fall.
youtube AI Harm Incident 2016-02-07T00:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugg2BtWozk8CNngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjFkjDPjqE2CngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjEW6MP3uLTC3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgjbNTENqsljHngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugj-Tm4fiodnsXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggOUDgUdR33tXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjGwm-c396lkXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Uggj8ubOGU2UeXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugg-1WzQ124krXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugif6gsoLWXGuXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
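Because the raw response is a plain JSON array keyed by comment id, it can be checked programmatically against the coded dimensions. A minimal sketch of such a lookup, assuming a valid JSON response; the `coding_for` helper is hypothetical, and the two records below are copied from the full response above:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment id.
# (Truncated here to two records copied from the full response.)
raw = '''[
  {"id":"ytc_Ugg2BtWozk8CNngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgjbNTENqsljHngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coding record for one comment id, or None if absent."""
    records = json.loads(raw_response)
    return next((r for r in records if r.get("id") == comment_id), None)

record = coding_for(raw, "ytc_UgjbNTENqsljHngCoAEC")
print(record["emotion"])  # -> outrage
```

The same lookup makes it easy to confirm that each comment shown on this page matches the dimension values the model actually emitted for it.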