Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
> "She can write 5x as many letters. That means they will need 5x fewer of her."

False. Or at least, a boss who doesn't take advantage of that across the board to make his or her company more valuable is literally wasting their time. Literally every industrial development in the history of mankind has not reduced the need for labor by accelerating productivity, it has only increased GDP/profit. None of them have reduced working hours, either.

> "What remains?" "Maybe for a while, some types of creativity. But the whole idea of superintelligence is that nothing remains; these things will get to be better than most everything."

False premise. What drives creativity (besides "bounded randomness", perhaps) is knowing what is valuable. To paraphrase the great Billie Joe Armstrong, "I write music for me. If other people find it valuable, that's a happy accident." No AI can know *fundamentally* what is valuable (some would argue that an AI can't truly "know" anything); only a consciousness can, because only a consciousness subjectively experiences joy and pain, disgust and delight, ugliness and beauty... and those are where notions of "value" ultimately come from.

Having worked with AI for years now to write code, I can say with certainty: they know the answer to almost everything, but the value of nothing. As it turns out, a very sophisticated next-token predictor that is trained on a ton of mostly-coherent data will *seem* to be smart, because much of intelligence may be mechanistic. It will seem to *echo* values, but that is only because of the training data.

> "I just don't like to think of what could happen." "Why?" "Well, because it could be awful."

So he can't name the monster, but he is still afraid of the monster. What would you call that sort of thinking? Baseless pessimism? 😆
youtube · Cross-Cultural · 2025-10-22T15:4…
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzCuevMMJRceV4TTaZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxMP9mD9t8hg6yZbZd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxDFNf8hFYyFYjQoi54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyTbRs-TgYHGUFIVZx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxaLd7Lb7hKLPc8cuh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxIITIij7Gk_ulxMsZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx60DCvqjELIpjgawp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz_pOspk3eBEiwlhct4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxpRkondPJFSaIts_F4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxe6oQ3wHdSbuU96H54AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
```
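The raw response is a JSON array of per-comment codes whose fields match the four coding dimensions above plus a comment `id`. A minimal sketch of how such a batch might be parsed and validated, assuming only that schema (the `parse_codes` helper and `REQUIRED` field set are illustrative, not part of the actual coding pipeline):

```python
import json

# Fields every coded record is expected to carry, inferred from the
# "Coding Result" table and the raw response above (an assumption).
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(text: str) -> dict:
    """Parse a raw LLM coding response and index records by comment id.

    Raises ValueError if any record is missing a required field.
    """
    records = json.loads(text)
    for rec in records:
        missing = REQUIRED - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
    return {rec["id"]: rec for rec in records}

# Example with one record copied from the response above.
raw = '''[
  {"id": "ytc_UgxDFNf8hFYyFYjQoi54AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

codes = parse_codes(raw)
```

Indexing by `id` makes it easy to look up the code assigned to any single comment, as the "Coding Result" table does for the comment shown here.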