Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There’s not enough compute power or energy to support what would be needed for AI to take over (if it were possible), so that’s a problem. There’s infrastructure that needs to be built, power generation that needs to be approved and constructed, and multiple other issues at play. That’s gonna take years or decades.

Also, I’ve seen so many technologies that were “the end.” 40 years ago, ATMs were gonna take all bank tellers’ jobs. There are multiple branches near me with tellers in them to this day. The technology is there. We don’t need them, but customers want them and the business couldn’t figure out how to solve that problem. We were going to be a “paperless world” 30 years ago. Plenty of docs are still printed (I’m quite tech forward, but I just printed stuff off today as it helps me learn). We have had the tech to replace paper for years and it still hasn’t happened. 8 years ago blockchain was gonna change the world. It hasn’t.

These things take time. Businesses and people are cautious with change. Just because they can do something doesn’t mean they will. There are also technical and financial limitations businesses have.

We should be just as worried about hedge funds buying up housing and creating a country of renters with no means to own a home. That’s a far more grave existential threat for most people than AI is in the next 10-15 years. Yes, AI is a real thing and the technology will continue to advance, but history tells me it takes a painfully long time for these things to mature and proliferate. Adapt now and try to be in a place where you’re ready in the future, but don’t live in fear like this guy clearly does (even though he claims to sleep well at night).
Source: youtube · AI Governance · 2025-09-06T04:1… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxqOj95XMKjdf0eX0N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxyvb9aX2JJWTv0IIx4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyowqFNMVyQ99W_L_R4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgygVhE-SVJILD7EVD94AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyjExM0gqbSqeuCan14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyA0dm3rXOUPjQkTst4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyuKCw0Dfo5Ht0ts6Z4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzntTTFQ24KZOALNVp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwdjqM1kNEk3Mfl48Z4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyknIg6FVXSCs_qpup4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
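A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the label sets are exactly those visible in this response (the real codebook may allow more categories), and it uses two of the records above as sample input:

```python
import json

# Two records in the shape of the raw LLM response above (truncated sample).
raw = """[
 {"id":"ytc_UgyA0dm3rXOUPjQkTst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxqOj95XMKjdf0eX0N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]"""

# Allowed labels per dimension, inferred from this one response only
# (an assumption, not the authoritative codebook).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "mixed", "fear", "approval", "outrage"},
}

def validate(records):
    """Return (id, field) pairs whose value falls outside the allowed labels."""
    problems = []
    for rec in records:
        for field, allowed in ALLOWED.items():
            if rec.get(field) not in allowed:
                problems.append((rec.get("id"), field))
    return problems

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index records by comment id

print(validate(records))                                   # [] when all labels are valid
print(by_id["ytc_UgyA0dm3rXOUPjQkTst4AaABAg"]["emotion"])  # resignation
```

Indexing by `id` is what lets the page above join one comment (here, the one coded as resignation) back to its row in the batch response.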