Raw LLM Responses

Inspect the exact model output returned for each coded comment.

Comment
The self-driving car should always stop before hitting someone else. If the car can't stop in time, it was poorly programmed and should be reprogrammed to increase following distance. If a jerk cuts off the self-driving car and spills something and collision is unavoidable, hit the irresponsible driver who instigated the accident by producing a problem that could not be resolved without incident. It's fair and unbiased.
Source: youtube, AI Harm Incident, 2015-12-28T16:5…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugj_f2_hIfbFIngCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgjuSAOvpXKjoXgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgjXkfuodsaTaXgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgiH4bJgUd72t3gCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugis_iWcr_zaLHgCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugg1rrdyzbR2AXgCoAEC", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UghCPalsjYnrLHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgjHJF2WYdJEkngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiXr1C50oWCgXgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugh_wlHO5sE7gngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
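The raw response is a JSON array with one object per comment, each carrying the four coding dimensions plus the comment id. A minimal sketch of how such a response might be parsed into per-comment records (the field names come from the sample above; the allowed-value sets are assumptions inferred from the values that appear there, not a confirmed codebook):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "mixed", "indifference", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response into {comment_id: dimension codes}."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        # Reject values outside the expected category sets so a malformed
        # model response fails loudly instead of polluting the results.
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

# Usage with the first record from the sample response:
raw = ('[{"id":"ytc_Ugj_f2_hIfbFIngCoAEC","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
codes = parse_coding_response(raw)
print(codes["ytc_Ugj_f2_hIfbFIngCoAEC"]["emotion"])  # outrage
```

Keying the result by comment id makes it straightforward to join each set of codes back to the comment text and its "Coded at" timestamp when building the table shown above.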