Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Mankind has always wanted to become God. Lot's of interesting techno science here, but nothing really intriguing. Go back to the Tower Of Babel, men tried this with what technology they had and it didn't end well. Sure, AI is in it's infancy, but will it make it to stage 5 or even approach 10? Short answer is NO. Longer answer is... AI most certainly will progress and even try to achieve transcendence, but ultimately even if it were possible, there are too many abruptions along the way. Just the political and economical challenges presents enough to stall the latter stages for many yrs. In a perfect scenario where debt, inflation, wars, political turmoil were non-existent these stages of development could progress faster. But setbacks like financial crises, or global wars could halt this considerably where all or most resources would be channeled toward high priority needs. But ultimately, if there is an all powerful, all knowing transcendent being (GOD) would He let these pieces fall into place for time, then intercede... because if the day's were not cut short no human would survive? How far does AI progress? If there is no transcendent being (God) then were left to our own devices... which has not fared well through the course of human history. We have already had two world wars. What would a world war III look like, and much of a set back would that be for AI ?
YouTube · AI Governance · 2024-01-06T19:4… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugywtx4ayPDuPRVsuFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEF8RiW5X_TPCRnMR4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx92ZDVoguXg73Jkdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwxH5NyAx-SbJEBM454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyDPxRPRZZ4sKrkkIh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHOL1Sb6pXQpnNQTB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwq_XEHgCOEtKjbbWJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwahjaKGuin8cO3CdJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxV2h1BixCfW9bVeGp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyWpMfC6_HPSrzEF0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
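Since the model returns one JSON array covering a whole batch of comments, matching a coded result back to its comment means parsing the array and indexing by `id`. A minimal sketch in Python (the `raw` string here is shortened to two entries from the response above; the id `ytc_UgzHOL1Sb6pXQpnNQTB4AaABAg` is the entry whose labels match the coding result shown):

```python
import json

# Two entries excerpted from the raw LLM response above.
raw = (
    '[{"id":"ytc_Ugywtx4ayPDuPRVsuFN4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgzHOL1Sb6pXQpnNQTB4AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"none","emotion":"fear"}]'
)

# Parse the batched response and index the records by comment id.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coded dimensions for the comment displayed on this page.
rec = by_id["ytc_UgzHOL1Sb6pXQpnNQTB4AaABAg"]
print(rec["emotion"])  # fear
```

Indexing by `id` rather than by list position makes the lookup robust if the model reorders or drops entries in its reply.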