Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The other thing is the idea of AI improving itself by feeding itself its own product. AI ends up eating its own asshole. This is what hallucinations are. Ai ends up doing its own thing and we just sort of become irrelevant as ai hallucinates itself into some kind of sideways reality.
youtube Viral AI Reaction 2025-11-05T05:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugx7lB9fw-nbf8ooL2N4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "ban",           "emotion": "outrage"},
  {"id": "ytc_UgwRDowZCXEs3PSE0oJ4AaABAg", "responsibility": "developer",  "reasoning": "mixed",            "policy": "liability",     "emotion": "indifference"},
  {"id": "ytc_UgyBCWGrZQW0vBzIpth4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_Ugy_EaV0gmwu9mo76T94AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgyIYpm8XlBgrW8qkvN4AaABAg", "responsibility": "user",       "reasoning": "mixed",            "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugyd3kxRl8deHjfqqM14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_Ugywxq8VuNBLNs7G8bh4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_Ugy9FVsenHPrB-z2Tch4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgyTT2klb9V9FxQYC2N4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgzS1heSlPcsFLJCdqp4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "ban",           "emotion": "outrage"}
]
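The raw response is a JSON array of per-comment codes, one object per comment id. A minimal Python sketch (an illustration, not the tool's actual pipeline) of how such a response could be parsed and looked up by id — the id below is the one coded above:

```python
import json

# A two-entry excerpt of the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_Ugy_EaV0gmwu9mo76T94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugyd3kxRl8deHjfqqM14AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Parse the model output and index each code object by its comment id.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding result for the comment inspected above.
code = codes["ytc_Ugy_EaV0gmwu9mo76T94AaABAg"]
print(code["responsibility"], code["emotion"])  # ai_itself fear
```

If the model ever returns extra prose around the array, `json.loads` will raise `json.JSONDecodeError`, which is a convenient hook for flagging responses that need re-coding.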