Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ok exactly! So the solution is to have a "self driving car"? I'd rather take my chances with a person driving who could have recognized an actual person. Not to mention the person in the car couldn't even stop the robotic car? Ridiculous. Most driving accidents happen when people are on their phones or impaired. Which is illegal in most states anyways to not drink and drive and/or be on the phone.
youtube AI Harm Incident 2018-03-20T17:0… ♥ 2
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          ban
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "ytr_Ugy1IiJDptCA69-aTt94AaABAg.8e0pid76WvD8e10TltVO-4", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugx5NDCKWssNtibFKb94AaABAg.8e0os6B1Y7_8e0sZtrga3H", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_Ugx5NDCKWssNtibFKb94AaABAg.8e0os6B1Y7_8e0wAZvxtUy", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugx5NDCKWssNtibFKb94AaABAg.8e0os6B1Y7_8e1Sf2lFf7j", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_oi0ig4o", "responsibility": "government", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
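The raw response is a JSON array of coding records, one per comment id; the coding result shown above is the record whose id matches the displayed comment. A minimal Python sketch of that lookup step (the function name `coding_for` is hypothetical, not part of the tool; the record shown is an excerpt of the real response):

```python
import json

# Excerpt of a raw LLM response: a JSON array of coding records keyed by comment id.
raw_response = """[
  {"id": "ytr_Ugx5NDCKWssNtibFKb94AaABAg.8e0os6B1Y7_8e0sZtrga3H",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "outrage"}
]"""

def coding_for(comment_id, raw):
    """Return the coding record matching comment_id, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = coding_for(
    "ytr_Ugx5NDCKWssNtibFKb94AaABAg.8e0os6B1Y7_8e0sZtrga3H", raw_response
)
```

Matching by id rather than by array position makes the lookup robust when the model returns records in a different order than the comments were sent.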