Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What if we programmed the self-driving car to act like a human during accident? A "panicked" reaction, one chosen at random, for each accident?
YouTube · AI Harm Incident · 2017-03-23T05:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugj6Ag92fNV7UHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiCg24wo3traXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugj_fCP6PonVvngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghM__dbpEusUHgCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx5GQ9AU56VhniOsNV4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzGmLXAowlf8hDSqX14AaABAg", "responsibility": "government", "reasoning": "unclear", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugj8wwkqOXZiYngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzaAz0xLcPvpdMdpLd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggecR9mOfVpW3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugg8WxFS06Ct3HgCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
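A raw response like the one above can be turned back into per-comment coding records with a small validation step. The sketch below is illustrative, not the pipeline's actual code; the allowed code values per dimension are inferred from the records shown here and may not be the full codebook.

```python
import json

# Allowed codes per dimension, inferred from the records above
# (an assumption, not an exhaustive codebook).
ALLOWED = {
    "responsibility": {"none", "user", "company", "government"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "approval", "outrage", "fear"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records)
    into a dict keyed by comment id, rejecting unknown code values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, as a minimal demonstration.
raw = ('[{"id":"ytc_Ugj6Ag92fNV7UHgCoAEC","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
coded = parse_coding(raw)
print(coded["ytc_Ugj6Ag92fNV7UHgCoAEC"]["emotion"])  # indifference
```

Rejecting records with out-of-vocabulary codes (rather than silently keeping them) makes drift in the model's output visible immediately, which matters when the coded values feed downstream aggregation.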