Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Every single "AI Takeover" "AI Dystopia" scenario I've seen relies on one SINGLE idea: that AI will reach General Intelligence. The thing is no one actually knows how to reach General Intelligence. Despite optimism by the "experts," General Intelligence is nowhere close to being created and will never be possible. That's right, General Intelligence is not possible. To create General Intelligence is to create an artificial human being. And we don't even understand our own intelligence or why we are conscious while animals are not. Simply put, we have no idea what intelligence even is, so replicating that with AGI simply is not possible with our current understanding. If AGI ever happens, we will not be alive to see it. In the meantime, something that could happen and already IS happening is: Dead Internet Theory. As these "dumb" AIs continue to take over the internet, eventually there will be no real people actually using it. Humanity may one day abandon the internet altogether and go back to life before the internet.
YouTube · Viral AI Reaction · 2025-11-24T07:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgzJ-y8Pp3yFIMAzWmd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugws_5ZUMVEG9YlbIEF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxcnkckF_o1LTt_YON4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugx3HK2DgJvCnPIZ-rJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugz4FpwQid89c-hrcn14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}, {"id":"ytc_Ugxg1vvJr4Jf_WSPIs54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxbzHx7qfMt7WoUsf14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzHLdGH6dLa3_EJWHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugz4_sjyXXKN5t2Bni14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzC4IBToXE0iyOfxD94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
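The raw response above is a JSON array with one object per comment, carrying the same four coding dimensions shown in the table (responsibility, reasoning, policy, emotion) plus a comment id. A minimal sketch of turning such a response into a per-comment lookup, assuming the response parses as valid JSON (the variable and function names here are illustrative, not part of the tool):

```python
import json

# Illustrative excerpt of a raw LLM response in the format shown above:
# a JSON array of per-comment coding objects.
raw_response = (
    '[{"id":"ytc_UgzJ-y8Pp3yFIMAzWmd4AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)

def index_codes(raw: str) -> dict:
    """Parse a raw response and key each coding record by comment id."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_codes(raw_response)
print(codes["ytc_UgzJ-y8Pp3yFIMAzWmd4AaABAg"]["reasoning"])  # consequentialist
```

Keying by id makes it easy to join each coded record back to the original comment text when inspecting results like the one on this page.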