Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The nuclear comparison will never work for countries because the danger isn't apparent in the same way. What will likely happen is that we keep making newer AI models that then become AGI, then ASI, and we all go along with it, with apprehension, but still along with it. Why? Because no one wants to die. What I mean by that is that ASI will be the key to hitting longevity escape velocity, making people immortal, which makes the birth rate a non-issue and saves all the countries suffering from population collapse. It's just the most likely course of action. Nuclear has as many gains as disadvantages, arguably more disadvantages; ASI, on the other hand, has so many advantages that potentially destroying the species just might be worth it. What it's going to come down to is this: do we all agree to slow down and let the next generation inherit immortality with ASI, or do we run to ASI as fast as we can and risk an ASI mishap that has a 1% chance of ending the entire species? If that's the choice, and indeed it is, most of us, the people on top and everyone else alike, will say go full speed: "I did not build this just for other people to benefit from it after I'm long gone." And who could blame them? We are talking about our lives here; no one is that selfless. In a world where moving fast is a sure path to solving the immortality problem, that is the more-than-likely outcome.
The only thing we have a chance at slowing down is maybe the implementation rate of the technology in day-to-day life. We might be able to curb the adoption rate to keep from going into a Great Depression, meaning it gets adopted at a rate the economy can handle. But all signs point to these robots one day being just good enough to replace most white-collar workers, and ten years after that getting just good enough again to replace all the blue-collar workers. Governments will probably implement a universal basic income for a time and let new jobs take the place of the old ones, which will likely work until about 2050, but after that it's anyone's guess, because the ASI will be so advanced you'll have a hard time finding any job it can't do. At most you could have several massive projects going on at the same time, from people going to Mars, to massive colonies hovering around the Earth, to moon bases built just because we can, not because we need to, creating a massive need for human ASI managers and solving the labor problems of the future. The short of it: ASI will not end jobs; in the long run it will just shift work off-world. Bottom line, companies' greed is immeasurable, and space piracy will become a thing. We are making a new frontier, and there will be a big rise in off-world jobs. So the 1% will offer prosperity in space in exchange for your place on the planet, which will largely be a lie, but most people will be put in a spot where they have no real choice but to believe it. When they're living in apartment complexes on a universal basic income, owning no land or anything of real value, with everything handed to them, there is really only one option left: go into space to own a home on a space colony.
It's a reality that after we solve the mortality problem of the human race, it will lead to more people than the Earth can support. We can't keep them here, because there's no land left for them when nobody dies, which creates a big market for space colonies, and with ASI it's 100% doable. As for whether we are making a utopia or a dystopia, that depends on perspective.
youtube AI Governance 2026-03-22T09:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
 {"id":"ytc_UgwkED1FLGvlc2IMmVt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyxMfxAK-Fmp-n2P-N4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwKRl54xheus_XHSw14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
 {"id":"ytc_Ugzpc_6HmyVaXvQnJgt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxZHkHpvftIabygleB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzlCvoNl4OkokzYqDt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwM1A698rswL4Wp02d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugyi3axQQ-0vFnR0bVR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxYRlbyB7iJIRA59Yt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugy1HUU8J11XyI2jiFN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
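A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, not part of the coding pipeline itself: the field names come from the JSON shown here, but the sets of allowed labels are only inferred from the values that appear in this one response, so the real codebook may define more.

```python
import json

# A trimmed raw LLM response in the same shape as the one shown above:
# a JSON array with one coding object per comment, keyed by comment id.
raw = '''[
 {"id": "ytc_Ugzpc_6HmyVaXvQnJgt4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# Coding dimensions with allowed labels inferred from the observed output
# (assumption -- the actual codebook may allow additional values).
SCHEMA = {
    "responsibility": {"government", "company", "developer",
                       "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codings(text: str) -> dict:
    """Parse one raw response into {comment_id: coding} and validate labels."""
    codings = {}
    for entry in json.loads(text):
        for dim, allowed in SCHEMA.items():
            if entry.get(dim) not in allowed:
                raise ValueError(
                    f"{entry.get('id')}: unexpected {dim} label {entry.get(dim)!r}")
        codings[entry.pop("id")] = entry
    return codings

codings = parse_codings(raw)
print(codings["ytc_Ugzpc_6HmyVaXvQnJgt4AaABAg"]["emotion"])  # fear
```

A lookup by comment id then reproduces the per-dimension table shown in the "Coding Result" section for any coded comment.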