Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It might be the case that people will die with autonomous driving. However, it’s important to balance that with lives potentially saved due to human error. It’s much harder to quantify that, but it’s still important to give context to the issue.
Source: YouTube · AI Harm Incident · 2024-12-21T22:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyFO-9eC5zlG_hNGah4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy6yOKkwiyREQn9Tcl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgweJGnMI6_EBDZwdvp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzr6JIuyym3nDLWXFF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwXXQuF4QbXVXAx_Ah4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwmE1kGedyjXd6UD0l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy-2IfQ7P4otztnO454AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzoyb7wAsOqXml2Ffx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxZCHcUzzyd5UhZgIB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy7-6-aXZv1BjShoNV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
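The raw response is a JSON array keyed by comment id, so the per-comment coding shown above can be recovered by parsing it and indexing on id. A minimal sketch in Python (the `raw_response` string is abbreviated to two entries from the response above, and `lookup_coding` is an illustrative helper, not part of the actual pipeline):

```python
import json

# Raw model output: a JSON array of per-comment codings
# (abbreviated here to two entries from the response above).
raw_response = '''
[ {"id":"ytc_UgwmE1kGedyjXd6UD0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyFO-9eC5zlG_hNGah4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"} ]
'''

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the model output and return the coding record for one comment id."""
    records = {rec["id"]: rec for rec in json.loads(raw)}
    return records[comment_id]

coding = lookup_coding(raw_response, "ytc_UgwmE1kGedyjXd6UD0l4AaABAg")
print(coding["emotion"])  # resignation, matching the Coding Result table
```

The id-keyed dict makes the lookup O(1) per comment and would also surface duplicate ids in the model output, which is a useful sanity check on the raw response.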