Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@skeetabomb This is something the "philosophers" need to understand. The foolishness to think that consciousness could just sprout from binary is ridiculous. We'll speed our doom by having "rights groups for AI" and jank like that which is just beyond stupidity. How arrogant people must be to think that we can create sentience when we don't even know how physical biology works 100% in hosting said consciousness? I swear it's going to be like the ending of the movie "AI". Where the world is destroyed and all that's left are these mimics of life by AI. It will be a soulless dark world with no life or intelligence - just mimicry. That's a worse outcome than annihalation if you ask me because said mimics could go on slaughtering REAL life in the universe while replicating itself. Same goes for the argument of "One day we'll upload our minds into machines". Yeah no, and same response - we don't know how it work with bio how does recording a personality equate to housing the personality in a system that can sense everything ON TOP OF being conscious? The ignorance scares me more than the AI because said ignorance will unleash it fully.
Source: youtube · AI Governance · 2023-06-27T15:2…
Coding Result
Dimension: Value
Responsibility: developer
Reasoning: deontological
Policy: ban
Emotion: outrage
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzQBnzqF0jnZc0SrNJ4AaABAg.9pnOHAIC4Vi9q5x3K7fDUB","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgzQBnzqF0jnZc0SrNJ4AaABAg.9pnOHAIC4Vi9r01dWG_WKS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzQBnzqF0jnZc0SrNJ4AaABAg.9pnOHAIC4Vi9rAENfOeBhR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzQBnzqF0jnZc0SrNJ4AaABAg.9pnOHAIC4Vi9rTQToa7lyn","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgxgSTUDxiBpUVbH-a14AaABAg.9pn36qQYbC89pozOe5fTh7","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugx4wPUZmkedl28p9Gt4AaABAg.9pmjEHCzwBK9q5vtsBHkah","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Ugx4wPUZmkedl28p9Gt4AaABAg.9pmjEHCzwBK9qqF5uqfnsH","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugx4wPUZmkedl28p9Gt4AaABAg.9pmjEHCzwBK9rSlsjZ0MES","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgwcfyOegBuaEinLnEJ4AaABAg.9pmWbcEv1kY9pomlU2Ivi7","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugx-QDMPRvL9w3Cs4KN4AaABAg.AN2Jum8L7R8ANPkFSrpVnr","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
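A raw response like the one above can be parsed and indexed by comment id to recover the per-dimension coding for any given comment. The sketch below is a hypothetical helper, not the project's actual pipeline; the id and values are copied from the fourth entry in the sample response, and the required key set is assumed from the fields visible in the dump.

```python
import json

# Keys every coding entry is assumed to carry (based on the dump above).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of codings) into {id: entry}."""
    codings = {}
    for entry in json.loads(raw):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing keys {missing}")
        codings[entry["id"]] = entry
    return codings


# One real entry from the response above, used as sample input.
raw = '''[
  {"id": "ytr_UgzQBnzqF0jnZc0SrNJ4AaABAg.9pnOHAIC4Vi9rTQToa7lyn",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]'''

codings = index_codings(raw)
coding = codings["ytr_UgzQBnzqF0jnZc0SrNJ4AaABAg.9pnOHAIC4Vi9rTQToa7lyn"]
print(coding["policy"], coding["emotion"])  # ban outrage
```

Indexing by id makes it easy to join the model's output back to the comment table shown in the "Coding Result" section, and the key check surfaces malformed entries before they silently drop a dimension.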