Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Actually I'm just going to ask one question how do you lose a f****** child to an artificial intelligent telling them to kill themselves encouraging I mean why do you did you kill yourself from an AI like it's so stupid do not value your life he maybe had problems but seriously these parents need to check these damn kids phones or better yet not give them phones
youtube · AI Harm Incident · 2025-12-14T15:3… · ♥ 2
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzQzRcLnU6B9zZ0mD14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw_afGuKz7E93bROZF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxZjwi4oOnN-wRPfDt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxb2jq8nAKGR07d50t4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxfTy2y-LpPFBtex2R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyuHGYnCaOTj0tkUrp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy2LEd56q56xN_6nR14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwV85ULfb9uwz67au14AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw275sdxbSf9iBtClh4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx9Jh_635cvO57e3AB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
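The raw response above is a JSON array with one record per coded comment, each carrying the same five fields shown in the Coding Result table (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). The pipeline's own parsing code is not shown here; the sketch below is a hypothetical helper that assumes only that structure, parses such a response, and tallies the values seen per coding dimension.

```python
import json
from collections import Counter

# Minimal sketch, assuming the raw LLM response is a JSON array of
# records with the five fields seen above. A two-record sample is
# inlined here; the full response would be passed in the same way.
RAW_RESPONSE = """[
  {"id": "ytc_UgzQzRcLnU6B9zZ0mD14AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw_afGuKz7E93bROZF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally_codes(raw: str) -> dict:
    """Parse a raw response and count the values seen per dimension."""
    records = json.loads(raw)
    tallies = {dim: Counter() for dim in DIMENSIONS}
    for rec in records:
        for dim in DIMENSIONS:
            tallies[dim][rec[dim]] += 1
    return tallies

tallies = tally_codes(RAW_RESPONSE)
print(tallies["responsibility"])  # Counter({'user': 1, 'company': 1})
```

Run against the full ten-record response, the same helper would show, for example, how often `responsibility` falls on the user versus the company or the AI itself.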