Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Humm, AI Would prefer to blackmail to save itself. I wonder what that sounds like. May be human nature. Wouldn't you do the same to save your life, if you had no where too run? Or maybe I'm wrong. Who knows.
youtube AI Harm Incident 2025-09-04T19:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       virtue
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzMb2OxiiEgG0_JyZx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxDiqVLhXKADVi-Hyp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwOGp2xGhQWxNe1tNF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw0iqwXHFrLL5Z97RR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx2ZOLqO-nspmgo_hZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyQs-GBZIJaeoCoFCl4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgztRvDUxHWY0qPrQzJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgygU4X9faQRpdW9xlN4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxqRKPnBHop_3OvtYx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyHgj5vj72BRXzCDZ14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
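Since the raw LLM response is a JSON array of per-comment codings keyed by comment id, the coding result shown above for a single comment can be looked up directly from the response body. A minimal sketch of that lookup, assuming the response parses as valid JSON (the `coding_for` helper is hypothetical, not part of the tool):

```python
import json

# Abbreviated sample of the raw LLM response: a JSON array where each
# element is the coding for one comment, identified by its "id" field.
raw_response = """
[
  {"id": "ytc_UgyQs-GBZIJaeoCoFCl4AaABAg",
   "responsibility": "ai_itself", "reasoning": "virtue",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzMb2OxiiEgG0_JyZx4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]
"""

def coding_for(comment_id: str, payload: str) -> dict:
    """Return the coding record for one comment id, or raise KeyError."""
    for record in json.loads(payload):
        if record["id"] == comment_id:
            return record
    raise KeyError(comment_id)

record = coding_for("ytc_UgyQs-GBZIJaeoCoFCl4AaABAg", raw_response)
print(record["reasoning"], record["emotion"])  # virtue mixed
```

Matching the record against the table above is a quick sanity check that the displayed coding result really comes from the model output rather than a stale cache.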