Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Terribly misleading video. Only examples shown are drivers clearly not paying attention. Crux here is self driving won't require full attention from the driver until it is ~100x safer than a human. This is clearly a technology that should have support. Maybe make a video on drunk driving accidents next, WSJ?
youtube · AI Harm Incident · 2025-01-09T20:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyyAe5xEiiP4fGWuh54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4gTON15A2Y6Zi_XZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwSfFIzILIF5xV5pgB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzl0ehMFyxEUee_q_V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxUZA4QEJsaYEIc6Rp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxcp4QbX7VAinCZ8U94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwMVR7V75jzoBurH0h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgweVnr66i0YqUdRuzR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzyRG49Cqs9ZoK7n054AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwy8mfNK3DR9o9hWu54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
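To inspect the coding for a specific comment, the raw response can be parsed as a JSON array and filtered by `id`. The sketch below is a minimal, hypothetical helper (not part of the coding tool itself); it assumes the raw response follows the record structure shown above.

```python
import json

# Abbreviated raw LLM response in the same shape as the full array above.
raw = '''[
  {"id": "ytc_UgyyAe5xEiiP4fGWuh54AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzl0ehMFyxEUee_q_V4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]'''

def lookup(raw_response: str, comment_id: str):
    """Return the coding record for one comment id, or None if absent."""
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None

coding = lookup(raw, "ytc_UgyyAe5xEiiP4fGWuh54AaABAg")
print(coding["responsibility"], coding["emotion"])  # user approval
```

The same pattern scales to batch checks, e.g. counting how many records assign responsibility to the company versus the user.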