Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mean first instinct would be hop up front press the break before it crashes. This person trusted no one to literally drive somewhere. They all going to need waymo help to navigate life.
youtube AI Harm Incident 2025-01-10T23:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          industry_self
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzmJCsTbrqezAaZtyN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwWngqlbfFCjPT9RmB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxT7kv52BvEvV_Itcd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwQ5g2svMv0023hhhZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwKS27RRiyuSwNHqy94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwr-Fkd66JSMFuZLXt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugyksc0655N7GQDcwuV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgytzcimCE_R61NRj154AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyGErxOvOzO0VoB16d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwAhnlhjdKZSKU3Rld4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
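A raw response in this shape can be parsed and lightly validated before the per-comment values are stored. This is a minimal sketch: the allowed label sets below are inferred only from the values visible in this response, not from an official codebook, and `parse_coding` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the labels observed in the
# raw response above; the project's actual codebook may include more.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example with a single (hypothetical) record id:
raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}]'
records = parse_coding(raw)
print(len(records))  # → 1
```

Validating at parse time catches the common failure mode where the model invents a label outside the schema, so only records that match the codebook reach the coded-result table.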