Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Can we be done with the simulation hypothesis please? If we had the ability to simulate billions of human consciousnesses for the purposes of entertainment or experimentation, why wouldn't we? Maybe for the same reason we currently don't grow billions of human children in labs to run experiments on. But what if the engineers have no regard for human consciousness? Then why bother simulating it? But isn't there a chance someone would do it anyway? Maybe, but give me one actual reason someone would. What value would it provide? When we simulate the weather, we use math. We don't model every air molecule. When we play a game, the game doesn't render the entire game universe at all times. It only renders the parts that the player is interacting with. And here's the kicker: I don't think anyone actually believes the simulation hypothesis, because even people who say they do live their lives as if they don't. If you were living in a simulation, why would you assume that your environment is just as real as you are? When you play a game, do you worry that the NPCs might be conscious? Do you think there are actions you could take that change the main quest? Why would you worry about climate change, nuclear war, or an AI takeover? The engineers control those outcomes, not you. Unless you live like a sociopathic hedonist, then I don't think you really believe the simulation hypothesis.
youtube AI Governance 2025-09-07T15:4… ♥ 2
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           unclear
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxD_WFRH-xVI6pjN9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyF66rpLe8tV8dheLV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzvk74pFFrRdzBRh3J4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy2eKZ53VYVctMzb1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwzpQxVHYJ4Oz6RW694AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyRTTgOlYaC4i-RCKJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwYnTCsfjOgpkQLXPx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzHtZzdAnswduPlggR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxOCA8SDjsz88oO2Np4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzG_YqBDT5xHB8HhRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
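To validate or cross-check a raw response like the one above outside the tool, a minimal Python sketch (the JSON array and its four coding dimensions are taken verbatim from the response shown; which comment id corresponds to the displayed comment is not stated on this page, so the lookup below is illustrative):

```python
import json
from collections import Counter

# Raw LLM response, copied verbatim from the page above.
raw = """[
 {"id":"ytc_UgxD_WFRH-xVI6pjN9l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgyF66rpLe8tV8dheLV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugzvk74pFFrRdzBRh3J4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy2eKZ53VYVctMzb1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwzpQxVHYJ4Oz6RW694AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyRTTgOlYaC4i-RCKJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgwYnTCsfjOgpkQLXPx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_UgzHtZzdAnswduPlggR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxOCA8SDjsz88oO2Np4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzG_YqBDT5xHB8HhRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]"""

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index the batch by comment id

# Every record should carry the same four coding dimensions.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}
for r in records:
    missing = DIMENSIONS - r.keys()
    assert not missing, f"{r['id']} is missing {missing}"

# Quick distribution across one dimension, e.g. responsibility.
print(Counter(r["responsibility"] for r in records))
```

Indexing by id makes it easy to pair each coded record back to its source comment, and the schema check catches responses where the model dropped a dimension.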