Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@GhibaGigi-q7m: yeah no. He clearly wanted to die, he just used a bot to help him. His parents failed him not a chatbot
Source: youtube · Incident: AI Harm Incident · Posted: 2025-11-10T02:5… · Likes: 2
Coding Result
Dimension        Value
Responsibility   user
Reasoning        virtue
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
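
Each coding assigns one value per dimension. A minimal sketch of the record shape, with the value sets inferred from the raw response below (an assumption based on the observed data, not a confirmed schema):

```python
from typing import TypedDict

class Coding(TypedDict):
    """One coded comment, as returned by the model.

    Shape inferred from the raw LLM response below; the value lists in the
    comments are only those observed in this batch, not an exhaustive schema.
    """
    id: str              # comment id, e.g. "ytr_..."
    responsibility: str  # observed: "company", "user", "unclear"
    reasoning: str       # observed: "virtue", "deontological", "consequentialist", "unclear"
    policy: str          # observed: "none", "regulate", "liability", "unclear"
    emotion: str         # observed: "fear", "approval", "hope", "outrage", "indifference", "resignation"
```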
Raw LLM Response
[ {"id":"ytr_UgzqyAYKW2YDgx9CZsd4AaABAg.APEUcjr1Y73APGi5YRyEGP","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"}, {"id":"ytr_Ugwht8eZlgHwBYQOfrx4AaABAg.APETCTfyHwRAPHVjHXKBdy","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytr_UgyIOqMIpxK_cozAaoF4AaABAg.APEO_5MAyJkAPEY8GTHS0q","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"hope"}, {"id":"ytr_UgyIOqMIpxK_cozAaoF4AaABAg.APEO_5MAyJkAPEjZ2fVyKO","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytr_UgxQeCKAmMV5KciDSA14AaABAg.APEOFZlmTQUAPF_mEs5-rY","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UgyWKVdG5tHDZDiBJUt4AaABAg.APEJrskWRiFAPEUtojKh85","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"}, {"id":"ytr_UgzQS9AzMAjsga73BDd4AaABAg.APEJqE9eM8jAPKYDf63YAS","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}, {"id":"ytr_UgzQS9AzMAjsga73BDd4AaABAg.APEJqE9eM8jAPLqgqEtCpx","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytr_Ugyqdv5SCuRJBXdjt5R4AaABAg.APEIxJGP984APEmEnAhwTL","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytr_UgxkY6bGcJcb6rvcC-94AaABAg.APEEl9ZYHGAAPEG6xiNdfN","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"} ]