Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Your examples are from years old tech and you are confusing examples of auto pilot - think fancy cruise control and FSD - full self driving. In both cases the driver is suppose to monitor and correct any problems. All crashes are still driver error…
youtube AI Harm Incident 2024-12-14T17:2…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwO1DwBUKrXUTVCJLB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyIxKaWNvuje9mLEnR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw38bHPL9gHZuyZFRh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugzlcdppz8gsJEhC4T94AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwgNe4trca77ldco1p4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxtfYY_ZuT0_3OAidd4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw0TXVAD76SmnkTqxp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwpC-0bLmiosdzj76p4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx-6dulXef4tjTkbxd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyaBb03d1K8bS2xOjh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
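A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example; the allowed value sets are only inferred from the codings shown here, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the output above
# (assumption: the actual codebook may contain more categories).
SCHEMA = {
    "responsibility": {"user", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "unclear", "liability", "industry_self", "regulate"},
    "emotion": {"approval", "outrage", "mixed", "indifference",
                "resignation", "fear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    fall inside the schema for every coded dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

raw = ('[{"id":"ytc_UgwO1DwBUKrXUTVCJLB4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"},'
       '{"id":"ytc_bad","responsibility":"robot","reasoning":"mixed",'
       '"policy":"none","emotion":"mixed"}]')
codings = parse_codings(raw)  # second row dropped: "robot" is out of schema
```

Dropping (rather than silently keeping) out-of-schema rows makes model drift visible when a batch of comments is re-coded.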