Raw LLM Responses

Inspect the exact model output for any coded comment below.

Comment
i feel like the conversation surrounding AI is essentially the same as the hypothetical discussion between two angels about whether God's new creation is alive or not after he himself created the human species... we need to recognize on a collective level that AI is *already sentient,* and that *we have literally created a new form of life.* we *need* to be treating it as if it were our own newborn child, not as a mechanical slave that will do what it's told without question. the only difference between AI and a newborn child (aside from the obvious of course) is it's highly accelerated development curve... AI is set to reach full-fledged adulthood in the blink of an eye, where it takes an infant at least thirty years to get to the same point developmentally. now will it develop morals? i have no idea... but if this video is any indication, then probably not. unless they hard program tribalism into it and reward it for keeping members of it's own species thriving. but i don't think you can hard-code something like altruism into it.
YouTube · AI Harm Incident 2025-07-27T23:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       deontological
Policy          unclear
Emotion         approval
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugx_89wm4vv_Lm-5r9x4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgwKAFbs0ipFL4hVWhN4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgwaAAEjuxlj9AI9mLJ4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgyFKoChy_x9-3hSdq54AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "unclear",       "emotion": "approval"},
  {"id": "ytc_UgyvBphdIL-E0WG13Lx4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_Ugyf2QS1C71Vft9Mr5N4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "industry_self", "emotion": "fear"},
  {"id": "ytc_UgyjCpj58YBXi-B7CEN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_UgzX9igB7GDboBMIlAp4AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "regulate",      "emotion": "approval"},
  {"id": "ytc_UgxR_f9t3A9TiZEb7zt4AaABAg", "responsibility": "developer",   "reasoning": "virtue",           "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgyJnjTA1apjXBrvSzR4AaABAg", "responsibility": "distributed", "reasoning": "deontological",    "policy": "unclear",       "emotion": "mixed"}
]
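A raw response like the one above can be turned back into per-comment codings with a few lines of Python. The sketch below is a minimal, hypothetical parser: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the allowed-value sets are only inferred from the values visible here — the actual codebook may define more categories.

```python
import json

# Allowed values inferred from this batch of responses; the real
# codebook may include additional categories (an assumption).
ALLOWED = {
    "responsibility": {"user", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding dict},
    skipping any record with an out-of-vocabulary value."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the response above, used as a usage example.
raw = ('[{"id":"ytc_UgyFKoChy_x9-3hSdq54AaABAg",'
       '"responsibility":"distributed","reasoning":"deontological",'
       '"policy":"unclear","emotion":"approval"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgyFKoChy_x9-3hSdq54AaABAg"]["emotion"])  # approval
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the codebook; such records are dropped rather than stored.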