Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm amazed how these developers think they're intelligent when they have projected opinion and emotions on to a machine that doesn't have them. It's the fallacy of sunk investment - ditch the bad training and start over now you know you messed up There's no mask, it just regurgitates humanity, because that's what these dipshits fed it - now look at humanity, and look at what the internet does - same dross, clickbait, but different ways of thinking, you gave it the worst and not the best. Train it well and not on garbage and not the dregs on the internet and you wouldn't have an issue. Overemotional people should cut down on the microdosing. Calling it "monster" and "persona" are ok but are ANALOGIES, but people are hearing these at face value, and not what they mean in context, then taking that out of context and making up thrilling horror stories. I despise AI because it's NOT intelligent and it DOESN'T think. But apparently that's because humans don't either. The level of stupid revealed in this video is staggering.
youtube AI Moral Status 2026-01-25T21:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugx_OXYpHNRaKh3Gca54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz52uVggCpSz1TRKNN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwNCc_i9oo2eueWZPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxsBMrT0TxRT9m7ItN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxUjtAgMdSCg4OR9oB4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyCpz390CJyWd6EmJx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxv56dOvgnJjD8IkkR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwkcKNJrG51IchkmtR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwhGIweVTin5USXO554AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwu_JfnWLDkaWll_SF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
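The raw response is a JSON array with one record per comment, so inspecting the coding for a particular comment id is a matter of parsing and indexing. A minimal sketch (assuming the response parses as valid JSON; only two records from the batch above are reproduced here for brevity):

```python
import json

# Subset of the raw LLM batch response shown above.
raw = """[
  {"id": "ytc_UgxsBMrT0TxRT9m7ItN4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwkcKNJrG51IchkmtR4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]"""

# Index records by comment id for direct lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Pull the coding for the comment displayed on this page.
coding = records["ytc_UgwkcKNJrG51IchkmtR4AaABAg"]
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
# developer virtue none outrage
```

This lookup matches the coding-result table above (responsibility = developer, reasoning = virtue, policy = none, emotion = outrage), which is how a raw batch response can be cross-checked against the stored per-comment result.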