Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's not so much that the AI has a brain of its own. For every death or injury, the company in charge of the AI should be, and eventually will be, legislated to create a report that demonstrates why (or rather how) a human wouldn't have performed any better in the same scenario. Waymo did just that: an injury was caused by its driverless car, but the company produced a very compelling report that a human driver would have caused a fatality instead.
YouTube · AI Responsibility · 2026-03-14T07:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxV_oqPW5usSOsg4wJ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyTm8iM6o_Cu66fwVB4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx0FlTLyFp9yGbKarZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxBHlev-idY-eLt-PZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyrT_VcOMBeN4U2t2h4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzzcEP1YLIwLaaYPVZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxGY2fw6fclRFpZ7uJ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzjbpPxDByAyJ8NGe14AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw0_cLPmJgDWdXOe9t4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugye_hBY2twoQZz9ABZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]
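The raw response above is a JSON array that maps each comment id to four coded dimensions. A minimal Python sketch of how such a payload could be parsed and checked before use; the ALLOWED vocabularies below are inferred from this one example and are an assumption, not the project's official codebook:

```python
import json

# Assumed value sets per dimension, inferred from the example response above.
# These are illustrative, not an exhaustive codebook.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Map comment id -> coded dimensions, rejecting out-of-vocabulary values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        for dim, val in dims.items():
            if val not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={val!r}")
        coded[cid] = dims
    return coded

# One record taken verbatim from the raw response above.
raw = (
    '[{"id":"ytc_UgyrT_VcOMBeN4U2t2h4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}]'
)
print(parse_codes(raw))
```

Validating against a closed vocabulary at parse time catches LLM label drift (e.g. a novel emotion value) before it silently enters downstream analysis.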