Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Give is 10 years and when the AI's answers hallucinates and says the patient needs to be put down. "Aight, let me grab my 20 gage." Because the doctors become too reliant of the AI's answers.
YouTube · AI Harm Incident · 2024-09-08T08:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       consequentialist
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxRm8yl2w838IHNVX54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsTpZZk4pTD404UDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwiHQ38loZQkDCmcp54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz39d2s3mdz5BBoUWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyE9j2bh1MTqD62Glx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy2EHjAxJgau4O8G2p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgzhhwBG5Zvi_WraASV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx0sKkHQBaT9aUtbP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy0QW4igRK970lTN9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2XmXuhSxBTRk_0iV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
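To trace a Coding Result back to the raw model output, the batched response can be parsed as a JSON array and filtered by comment id. A minimal sketch, assuming the raw response is valid JSON in the shape shown above (the `coding_for` helper and the truncated `raw` sample are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt of a raw batched response: one coding object
# per comment, each carrying the four coded dimensions.
raw = '''[
  {"id": "ytc_UgwiHQ38loZQkDCmcp54AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz2XmXuhSxBTRk_0iV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]'''

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Return the coding object for one comment id, or raise KeyError."""
    for record in json.loads(raw_response):
        if record["id"] == comment_id:
            return record
    raise KeyError(comment_id)

# Look up the coding for the comment shown above; its fields should
# match the Coding Result table (responsibility, reasoning, policy, emotion).
coding = coding_for(raw, "ytc_UgwiHQ38loZQkDCmcp54AaABAg")
print(coding["emotion"])
```

Looking up by id rather than by array position guards against the model reordering or dropping comments in its response.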