Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Is it bad design? One can imagine circumstances where input from the driver can and should override what the automated systems are doing.
Source: youtube · AI Harm Incident · 2025-08-15T19:2… · 11 likes
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgwRdWl9Hn_57Rcf3bJ4AaABAg.ALrISh_2tu8ALreDBfRGnJ", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugz05N2k2HstAfTObnx4AaABAg.ALrHidOVNdvALrZs0HtxgH", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugz05N2k2HstAfTObnx4AaABAg.ALrHidOVNdvALtOaIB2B4K", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgyTq7JuerURsRbTaup4AaABAg.ALrHCB-6LCsALrHf0t6nK2", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzxC_zUpmJISuupmWd4AaABAg.ALrH84MDSFpALrIUbJvpMx", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzxC_zUpmJISuupmWd4AaABAg.ALrH84MDSFpALrYBCn4z1O", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgzMGVyHLrEuoXa4Ffl4AaABAg.ALrGv8aYGBwALrMoXZk1S_", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgytZ25ed6ApFl6kN_R4AaABAg.ALrGhlcCiwHALrHrQx_k6", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgytZ25ed6ApFl6kN_R4AaABAg.ALrGhlcCiwHALrWvR17sWi", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwdMfgxGbxCg529RJ54AaABAg.ALrGaBip6kZALrHMNqDVfv", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
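A response in this shape can be loaded and tallied per dimension with a few lines of standard-library Python. This is a minimal sketch, not the tool's actual parsing code; the record ids below are shortened placeholders rather than real ids from the dataset.

```python
import json
from collections import Counter

# Raw LLM response in the format shown above.
# Ids are hypothetical placeholders, not real dataset ids.
raw = '''[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Count how often each value appears in each coded dimension.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(r[dim] for r in records) for dim in dimensions}

print(tallies["responsibility"])  # Counter({'none': 1, 'company': 1})
```

Keying the records by `id` (rather than position) is what lets each coded value be joined back to its source comment, as in the "Coding Result" panel above.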