Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I bailed out because the story seemed off, maybe invented by AI... because that's what someone wanted AI to do, not because AI had an agenda of its own. I know very little about it, but the story smelled bad to me.
youtube AI Harm Incident 2025-07-24T06:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugxr_5HNsrUrBeoKxCh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyplbT_iN_jV4lCTvt4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwIPqz4m8lyrDdNlfF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw92BJwtHOayajwwUp4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugymm1Pby04p0TsR6pp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzMhdFOuYgrebB5bnR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy9QKuQAXF-DKC4Bed4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzgkE06OxZ32dq3y0x4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxxch-kdXxHYdilBnh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzNcVnYnKg-NNH03xV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
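Batch responses like the one above can be checked against the coding scheme before use. A minimal sketch in Python, assuming the allowed codes are exactly those visible in this section (the real codebook may define more values, and the `validate` helper and its schema are hypothetical):

```python
import json

# Allowed codes per dimension, inferred from the responses shown above.
# Assumption: the real codebook may contain additional values.
SCHEMA = {
    "responsibility": {"developer", "user", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"regulate", "industry_self", "liability", "ban", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only rows whose
    coded dimensions all fall inside the schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Hypothetical two-row batch: the second row uses an out-of-schema emotion.
raw = (
    '[{"id":"ytc_a","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"joy"}]'
)
print([row["id"] for row in validate(raw)])  # → ['ytc_a']
```

Rows that fail validation can then be flagged for re-coding rather than silently dropped, depending on the pipeline's needs.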