Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If the driver was notified 19 times to take control of the vehicle, it should have stopped the car or disabled itself. It shouldn't let the car continue in self-driving mode.
YouTube AI Harm Incident 2024-12-14T10:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzrwUzVj-yRwNmfZTR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugw4byAxNBbCL1z9u_x4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzhE2HVAjVUsma6l714AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwRIXGmQJHrCKEwiml4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzS438WTHMQ9Gs1edp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwXP-e2CZq2Y_Pw_eR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzFcU90ipf46WISLyx4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxEWGG1uyDP6r_ZFl14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwqmTPbRHDLVByMWk94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxSNbNTrVFUorXGCnR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"}
]
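A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the project's actual validation code: the allowed value sets are inferred from the values observed in this response rather than taken from the real codebook, and only two of the ten records are used as sample data.

```python
import json
from collections import Counter

# Allowed values per dimension — inferred from the observed output,
# not from the project's actual codebook (an assumption).
ALLOWED = {
    "responsibility": {"company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "indifference", "approval", "resignation", "fear"},
}

# Two sample records from the raw response above.
raw = '''[
  {"id": "ytc_UgzrwUzVj-yRwNmfZTR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugw4byAxNBbCL1z9u_x4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Reject any record whose code falls outside the allowed set.
for rec in records:
    for dim, allowed in ALLOWED.items():
        if rec.get(dim) not in allowed:
            raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")

# Tally each dimension across the batch.
tallies = {dim: Counter(rec[dim] for rec in records) for dim in ALLOWED}
print(tallies["responsibility"])  # Counter({'company': 2})
```

A check like this catches the most common failure mode of LLM coding, a value outside the codebook, before it reaches the results table.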