Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
unfortunately what does video does not mention is that the self-driving car is supposed to put itself in situations that reduces the risk before anything would happen. the self driving car should already be programmed to have at least a two second buffer between itself and the vehicle in front giving itself enough time to stop or slow down to reduce the risk to the self driving car and its passengers.
Source: youtube · AI Harm Incident · 2016-02-08T16:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugg2BtWozk8CNngCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgjFkjDPjqE2CngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjEW6MP3uLTC3gCoAEC", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgjbNTENqsljHngCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugj-Tm4fiodnsXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggOUDgUdR33tXgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgjGwm-c396lkXgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Uggj8ubOGU2UeXgCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugg-1WzQ124krXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugif6gsoLWXGuXgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
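A raw response in this shape can be parsed into per-comment codings with a few lines of Python. This is a minimal sketch, not the tool's actual ingestion code: the allowed values below are only those that appear in this log (the full code books are an assumption), and the sample `raw` string is a two-row excerpt of the response above.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this log.
# The complete code books used by the coding pipeline are an assumption.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "unclear"},
}

# Two-row excerpt of the raw LLM response shown above.
raw = '''[
  {"id": "ytc_UggOUDgUdR33tXgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate",
   "emotion": "indifference"},
  {"id": "ytc_UgjbNTENqsljHngCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "outrage"}
]'''

def parse_codings(raw_json: str) -> dict:
    """Parse a raw response into {comment_id: coding}, dropping invalid rows."""
    records = {}
    for row in json.loads(raw_json):
        coding = {dim: row.get(dim) for dim in ALLOWED}
        # Keep the row only if every dimension holds a known code.
        if all(coding[dim] in ALLOWED[dim] for dim in ALLOWED):
            records[row["id"]] = coding
    return records

codings = parse_codings(raw)
print(codings["ytc_UggOUDgUdR33tXgCoAEC"]["reasoning"])  # consequentialist
```

Validating against the code books at parse time catches rows where the model hallucinated an out-of-vocabulary label before they reach the coding-result table.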