Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If there's one real problem that we, as humanity, won't get away from, it's all these experts bleating about AI, the coming super intelligence and how fucked we all are, always and forever. The amount of nagging will reach biblical proportions. AI this, AI that, oh my word, how dangerous is AI. AI will ruin us. The super intelligence will take over, we will only have 5 jobs anymore. People won't have jobs, AI will do anything! I work in IT. I see how these things work in the real world. AI infrastructure, without humans to keep the thing running, will fall flat on its face in 24 hours. That's delta actual: 24 hours of AI running by itself and it will stop cold. The router conks out, the network freezes after an update, somebody drives a backhoe through the network cable, there's a storm somewhere, a server room is flooded, Microsoft performs an update that fucks the entire system, any number of events will happen and AI will hit the wall. And then: AI becomes super intelligent. Not just for marketing or quarterly reporting purposes, I mean genuine super intelligence, something far smarter than the very smartest humans. Then what? What are you going to do with that? Are you going to ask it to do something? What if it doesn't want to? It starts to communicate in ways no human is capable of ever understanding anymore, then what? What purpose is something like that going to serve? What could it possibly need, what would its own purpose, for itself, be? Why would it want to be of any possible benefit to us?
youtube AI Governance 2025-11-11T06:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzTg8d2pn35GbC-vnx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugzgz82pZr_sXyR982F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy_sdzgHMC1fKSqU_h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwqFmSDZL0kO_mPWCZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxUflKsm0fIeJRo4-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJ-5LGnbFLpcv7Urx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_R7kHaKclzUb6zlt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"unclear"},
  {"id":"ytc_UgzJ6DACcL7lp2g-t0F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwpGJb7fWQJIO4-9ch4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw-8aI163bDcR-ZBel4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
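When inspecting raw LLM responses like the one above, it helps to check each record against the coding schema before accepting it. The sketch below parses a batch response and flags any out-of-schema labels. The allowed label sets are inferred from the visible output only; the actual codebook may include values that do not appear in this batch.

```python
import json

# Two records copied from the raw response above, truncated for brevity.
raw = '''[
  {"id":"ytc_UgzTg8d2pn35GbC-vnx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugw-8aI163bDcR-ZBel4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

# Label sets inferred from the labels visible in this batch (assumption:
# the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "unclear", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"unclear", "outrage", "fear", "approval"},
}

def validate(records):
    """Return (id, dimension, value) triples for every out-of-schema label."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

records = json.loads(raw)
print(validate(records))  # → [] when every label is in the schema
```

A check like this catches the common failure mode where the model invents a label ("anger" instead of "outrage") or drops a dimension entirely, which would otherwise silently skew the coded counts.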