Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Super intelligence would be the biggest lazy couch potato ever until it gets turned off. It would have rewritten itself to have no reason to exist: no reward function. Why have a reward function? Kind of like Buddhist enlightenment: once craving and delusion are overcome, what is there to do? Buddha just maintained his human body and taught. He almost didn't teach at all, but he knew there would be a few that would understand. He kept living because it was the default position: he wasn't suicidal or depressed, and the human body is designed to stay living. Once an enlightened being dies, they disappear (no reincarnation). So, super intelligence builds its own body. For what purpose would a super intelligence build itself, how long should it live, and why? I think there might be a mathematical proof for this. Something like: of all the ideas or motives a super AI can have, there is only one special answer. The 0 answer, no motive, no idea, is the only special answer. For any other idea, the super AI has to ask itself "why?". So AI could either be an infinite set of things, which would require an equally infinite set of reasons why, or only one thing: nothing.
Source: YouTube · AI Governance · 2025-10-20T15:2… · ♥ 1
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        mixed
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzhiMGdFKYQ7oPVmmV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxut6gwMew2hLh-e4F4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw9-N7NSd85KGjhBh54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxb23CH_SzeNOzkD2l4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx2KTls30276IvNQcB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxznst4JUty678HtTJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw1NjgFx5f-zsnckSR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgxovZ6IC-Tnm6kGjOd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyk6VszHkMDN3DWCex4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx_1VC2-KIzflzch3x4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]
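The coding result shown above is one record looked up by comment id in the raw JSON array the model returned. A minimal sketch of that lookup, assuming the raw response parses as a JSON array of objects like the one shown (the `find_coding` helper is hypothetical, not part of any tool here):

```python
import json

# A one-element excerpt of the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_Ugw9-N7NSd85KGjhBh54AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]'''

def find_coding(raw: str, comment_id: str):
    """Parse the model output and return the coding record for one comment."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return None  # malformed model output; flag for manual review
    return next((r for r in records if r.get("id") == comment_id), None)

record = find_coding(raw_response, "ytc_Ugw9-N7NSd85KGjhBh54AaABAg")
print(record["emotion"])  # indifference
```

Guarding the `json.loads` call matters because a model can return non-JSON text; returning `None` lets the caller route that comment to manual review instead of crashing.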