Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The main problem is that even though you "have" to be aware all the time, the system makes people not do so. There shouldn't be an expectation of fully awareness on full self driving systems.
youtube AI Harm Incident 2025-10-26T17:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgyQf8swOlrJlJChnfh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz63f-TRb6quVkhT-94AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugzt_CTJdbbK7UhZIER4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwZOEB6llkNmmbrYWZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzvLlQz5Yolfw2WykR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy-6P2nmYNVBN7RROV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgygEs_7Gz-g0haaWTR4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzh1BTWwtF-0YTioSB4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzcnbzmMB4Fi2VMyKh4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzPIfb1n7mD-FhgUOV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
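Since the model returns one JSON array covering a whole batch, finding the record behind a single comment's coding means matching on the `id` field. A minimal Python sketch of that lookup (the `coding_for` helper is hypothetical, not part of the original tooling; the sample below uses one record from the response above):

```python
import json

# Sample raw response, abridged to one record from the batch above.
raw = '''[
  {"id": "ytc_UgwZOEB6llkNmmbrYWZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coding record whose id matches comment_id, or None."""
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            return record
    return None

record = coding_for(raw, "ytc_UgwZOEB6llkNmmbrYWZ4AaABAg")
print(record["emotion"])  # fear
```

Matching the retrieved record against the Coding Result table is a quick way to confirm the stored dimensions were parsed from the right entry in the batch.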