Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I haven’t liked a lot of the traits and factors about AI for a long time tbh. There are certain potentials with it that I have never trusted. Matters like this being an example. Loneliness doesn’t only stem from being physically alone. In my personal experience, (& many other ppl I have spoken with)I’ve been surrounded by people, busy & with many friends, yet felt the most alone I had ever felt. Because as my dad always said “Wherever you go, there you are.” Meaning, the loneliness is deep rooted inside our internal worlds. It’s not always our external worlds that can have the detrimental effects on us. If our internal world feels like a barren wasteland, with the only relief coming from a damaging source. Like in this tragic, sad situation. Whereby he it was clear he felt like he could never have a real life connection “like the one artificial connection” he had with this bot. He knew he couldn’t *actually* be with her physically or intimately ever. So I can totally understand how a teenager in particular can start to feel like the real world means nothing to him anymore. Which is heartbreaking in many levels. Character AI & apps like this create disassociation & separation from real ppl & the real world. It’s *already* gone way too far. People have to remember we come from nature & the less we are connected to nature & don’t take a min away from technology. We won’t find real balance so our internal worlds & external worlds are as healthy as they can be for us.
youtube AI Harm Incident 2025-07-21T02:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyyYefxFTL_y1pLbFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxkKq6Dsu7bHbsh7y54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw5S4-bLBxZQ_qCBRp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzeLC6-dGDK3RJ6Ztp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwO9COuCYPr983dqYZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzNEOT6euwEUA_17-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzQQOVxNKylNnVu_Ep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzswOkP7bLnfu438ZF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw-uzKKNgsodIWF9lN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzKBOL5h9q0-9VrBo14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
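The raw response above is a JSON array of per-comment codes, one object per comment id with four coding dimensions. A minimal sketch of how such a response could be parsed and validated, assuming Python and label vocabularies inferred only from the values shown here (the full codebook likely contains more labels):

```python
import json

# Allowed labels per dimension, inferred from the response above.
# Assumption: this is an illustrative subset, not the complete codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"resignation", "fear", "indifference", "outrage",
                "approval", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, checking labels."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = parse_codes(raw)
```

Validating labels at parse time catches the common failure mode where the model emits a value outside the codebook, so bad records fail loudly instead of silently entering the coded dataset.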