Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LLMs do not understand the subject, they just know the words most often used. Are you ok with your Dr hallucinating your treatment?
youtube AI Harm Incident 2024-06-12T18:2…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_Ugwk0whq-SpmnoF2HRl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},{"id":"ytc_Ugz-ycPS_WkDIeRalSF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_Ugx8R4uNc_0zYwkKJqJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugz-c1FxU5o2uJaMe-l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},{"id":"ytc_UgyPlcPV64PVaXMSlt54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},{"id":"ytc_UgyciZmEJLVeKa2Sd614AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugy6ZWikFdFvlFvYW5R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugwh2yaYSm37A7o3Hr54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgwFphvk8XXjgyupt4p4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},{"id":"ytc_Ugxdknz3VER-csLewep4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"]}
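Note that the raw response above is not valid JSON: it ends in `"]}` instead of `}]` (the final object's closing brace and the array's closing bracket are transposed), which is consistent with every dimension in the coding result falling back to "unclear". A minimal sketch of how a parser might handle such a response defensively — the helper name, the fallback behavior, and the dimension list are assumptions for illustration, not the pipeline's actual code:

```python
import json

# The four coded dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def code_from_raw(raw: str) -> list[dict]:
    """Parse a raw LLM coding response.

    If the response is malformed JSON (as in the example above),
    fall back to a single row with every dimension marked 'unclear'
    rather than crashing the coding run.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return [{dim: "unclear" for dim in DIMENSIONS}]
    # Tolerate missing keys in otherwise valid rows.
    return [{dim: row.get(dim, "unclear") for dim in DIMENSIONS} for row in rows]


# Hypothetical truncated example reproducing the "]}"-for-"}]" transposition:
bad = (
    '[{"id":"ytc_x","responsibility":"developer",'
    '"reasoning":"deontological","policy":"liability",'
    '"emotion":"indifference"]}'
)
print(code_from_raw(bad))
```

Running this on the transposed-bracket string raises `json.JSONDecodeError` internally, so the function returns the all-"unclear" fallback row — matching the coding result recorded above.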