Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I find it interesting that the theory of singularity is treated as an inevitable fact. It may not be. The amount of energy required to run LLM AI today is enormous and much more will be needed for a general AI. Will the amount of power needed be able to be generated? Will there be a financial return on investment? It is assumed as fact that intelligence can grow exponentially when we have no proof one way or another. If it can, wouldn’t you need an exponential increase in corresponding energy generation and computing resources to support it? It also implies that there isn’t a limit to intelligence. There may be such a limit, a constraint of physical laws such as data or energy transfer. Nature does it organically after millions of years of evolutionary optimization. We have no guarantee that we can replicate and exceed this result. Any one of these factors would prevent a singularity event. The AI bubble will eventually burst.
YouTube AI Moral Status 2025-07-06T08:3…
Coding Result
Dimension: Value
Responsibility: distributed
Reasoning: consequentialist
Policy: regulate
Emotion: fear
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgySqv4ftpCRdvpQ_L14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwIWsHI6ARkvhdMqqN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwXubWUW-LwNbn8Hgt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugw0dloPErJxm-odayJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxHHXRUt5V63NpCfIF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzXRuWiJE0yUNdK3Od4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxLbAXrxfVPmRA3YoR4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxXsxbaPCKcB63q5qZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8UNAAWABCIXxALxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzuO8_rT-LqjO_8ZaB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
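The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) keyed by a comment id. A minimal sketch of how such a response could be parsed and a single comment's coding looked up is below; the field names come from the dump, but the helper function `lookup_coding` and the truncated sample data are illustrative assumptions, not part of the original pipeline.

```python
import json

# Illustrative two-record excerpt of the raw LLM response shown above;
# the real response contains ten records.
RAW_RESPONSE = """[
  {"id":"ytc_UgySqv4ftpCRdvpQ_L14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwIWsHI6ARkvhdMqqN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw model output and return the coded record for one comment,
    or None if the model produced no record for that id."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

coded = lookup_coding(RAW_RESPONSE, "ytc_UgwIWsHI6ARkvhdMqqN4AaABAg")
print(coded["responsibility"], coded["emotion"])  # distributed fear
```

This matches the table for the singularity comment above: the record coded `responsibility=distributed`, `reasoning=consequentialist`, `policy=regulate`, `emotion=fear`.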