Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mentioned to him earlier – it's 320 million miles "in vehicles equipped with Autopilot hardware". That means basically any Tesla produced since 2014, since all Teslas include autopilot hardware. Some owners pay more to get autopilot itself, which is a software upgrade. Any Tesla with Autopilot hardware has automatic emergency braking, regardless of whether the owner paid for the software upgrade. That means that "in vehicles equipped with Autopilot hardware" basically means "vehicles with automatic emergency braking". A much lower fatality rate with AEB is not unexpected. But it doesn't mean that autopilot improves safety, it means that AEB improves safety. Good – and it should become a standard feature – but it is misleading to say "1 death per 320M miles for Tesla Autopilot". On a broader note, I hate everything about how Tesla's blog entry was written. It's pretty misleading, and also implies the driver was given warnings that there was something wrong before the crash (not true on a closer reading).
youtube AI Harm Incident 2018-04-03T17:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgxnAxQEhpo_S2ns_Qt4AaABAg.8eZHUlbvIAx8e_xeXebC8P","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugyn1ZengAJgfFbr2NZ4AaABAg.8eWyftmTRXS8e_x_CkpXj_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyyrcUSwtvoJEu5c2F4AaABAg.8eWczd0P9p-8eWytbPTdax","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyyrcUSwtvoJEu5c2F4AaABAg.8eWczd0P9p-8e_yU-dV9Zp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzRZS9mm1mC03HJyOx4AaABAg.AP2Nj4c5DzQAP6N4q7M_ft","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzRZS9mm1mC03HJyOx4AaABAg.AP2Nj4c5DzQAP6S6NKvAo5","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxgzde0DDPcpert_N54AaABAg.AA27Kt28vnCAA5_GRL6S4W","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugxtt1T7bm28i-usq9l4AaABAg.ATDO4waZ6lqATQek20YsXp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugx_BMhSBu-Fnm2GIRJ4AaABAg.AJ2Ub2cK63xAOE2hNFsSMS","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwBpYkqedHvqSiYhiZ4AaABAg.AGkc1CPNgVEASDzBIM2zZo","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
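The raw LLM response above is a JSON array with one object per coded comment, each carrying the comment id and the four coding dimensions. A minimal sketch of how such a response could be validated and tallied (using two real entries from the response above; the parsing approach is an assumption, not part of the original pipeline):

```python
import json
from collections import Counter

# Two entries copied from the raw LLM response above, for illustration.
raw = '''
[
  {"id": "ytr_UgxnAxQEhpo_S2ns_Qt4AaABAg.8eZHUlbvIAx8e_xeXebC8P",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzRZS9mm1mC03HJyOx4AaABAg.AP2Nj4c5DzQAP6N4q7M_ft",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
'''

codes = json.loads(raw)

# Check that every record carries all four coding dimensions before tallying.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
assert all(all(dim in code for dim in DIMENSIONS) for code in codes)

# Tally one dimension across the batch.
tally = Counter(code["responsibility"] for code in codes)
print(dict(tally))  # -> {'none': 1, 'company': 1}
```

The same pattern extends to any of the four dimensions, or to joining each record back to its source comment by id.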