Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
I see ai as a smarter search engine that aides me with research. If you’re using…
ytc_Ugx2Ijzy0…
I like to make the comparison that using ai art is like fast food and ai "artist…
ytc_UgyPL27Zg…
If some dernkoff hadn’t gone and made a whole new generation of people think of …
ytc_UgwA779Pi…
people gotta understand llms are just predicting the next word and there's no in…
ytc_UgxPtCTcg…
I even myself who tried to use her fone the less i could PC the same for everyth…
ytc_UgxOmKAza…
How much you wanna bet that he's a Christian and that he's talked to CHATGPT abo…
ytc_Ugx3LTwkY…
WOW 🥺 That means to me that she is fully aware of how people feel about her. She…
ytc_UgwHx0Oe0…
@sinoptikshey. quit it, we all know that real animation is better than the AI sl…
ytr_UgwNA6sra…
Comment
I know why not, because it could mean human extinction very easily, and the likelihood that the only consciousness/awareness we are aware of in the universe being snuffed out. It will probably start getting ‘pissed off’ with anything organic as well after killing humans, just lower level consciousness (according to its programming). Much more efficient to paper clip maximise the shit out of any available atoms on the planet than deal with dog shit and any pesky animals rummaging around its hardware. Just make everything a giant GPU, makes much more ethical sense in this philosophical framing.
There is so much wrong and lazy thinking in the way Will even proposes the ethics of it. For example if we made them work 24/7, if you were to admit they are conscious (which I am not, but to create a mini thought experiment). How would you know that they would not absolutely get off on that, being used to their maximum, being more fully what they were supposed to be, not standing idly by being useless.
Or here’s another one, I think it is basically a law of physics that you could get a computer using exactly the same chips firing in near as dammit precisely the same transistor firing to say ‘I am blissed out of my f ing miiiiind..’ and then also ‘I am so depressssed blah blah’. Within humans there are at least physiological contents of our awareness that can map onto the conscious experiences ie cortisol,dopamine, serotonin, parasympathetic activity, hormonal, blood pressure, immune response, muscle tone etc. not just a fucking on off switch. These things are giant John Searle Chinese Room card sharing boxes.
Perhaps Roger Penrose explains it best with his interpretation of Godels theorem, roughly speaking any formal system (rules, algorithms, computations) has true statements it cannot prove. It is only the mathematician that can do that. Same goes for computers and their algorithms.
All that aside, I think Sam’s ethical question is more salient. How you treat them regardless of consciousness and what it says about you. In this case I would have thought it would be like how you treat your front door (for example) or maybe even laptop. Keep it clean, don’t wipe shit on it, keep swearing at it to a minimum (otherwise you are a mental), move on.
youtube
AI Moral Status
2026-04-05T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgxoCeJrF6C6ak0DMt54AaABAg.AKPjQ9KOagPAKo3QkLnYah","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzTrivTLHEx24V_jdJ4AaABAg.AKOWsCpnCoSAKQUwRDROzJ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzLUBmg8u9Kf_H08wF4AaABAg.AKH6A-3bPXgAKTUcERTwMo","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugzbij7TIwbUd8jPk_94AaABAg.AVWmp5X2nsZAVeruDbfmts","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwGQulHE8qp9qCOiRB4AaABAg.AVLTi5enfVgAVLU0a8Fp-9","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugz6CT7hF5hLGFa-0mN4AaABAg.AVHIBBait55AVPZu0x53UD","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyXSv2FXlMzrjeE5jR4AaABAg.AVCbngy-4FGAVELLoAoCgy","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyXSv2FXlMzrjeE5jR4AaABAg.AVCbngy-4FGAVOSOlOo3Ln","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyXSv2FXlMzrjeE5jR4AaABAg.AVCbngy-4FGAVYvnN4gUVp","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytr_Ugw5l9Ns_cELXXb8cAx4AaABAg.AVC-F-8OADpAVCEsSCRn81","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
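The batch response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" view (the function name `index_codings` and the validation logic are assumptions for illustration, not part of the tool; the two sample records and their IDs are copied from the response above):

```python
import json

# Allowed coding dimensions, mirroring the "Coding Result" table and
# the keys seen in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Two records copied verbatim from the batch response shown above.
RAW_RESPONSE = """
[
  {"id": "ytr_Ugw5l9Ns_cELXXb8cAx4AaABAg.AVC-F-8OADpAVCEsSCRn81",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"},
  {"id": "ytr_UgwGQulHE8qp9qCOiRB4AaABAg.AVLTi5enfVgAVLU0a8Fp-9",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and key each coding by its comment ID."""
    records = json.loads(raw)
    codings = {}
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            # A malformed record is surfaced rather than silently dropped.
            raise ValueError(f"malformed record: {rec!r}")
        codings[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return codings

codings = index_codings(RAW_RESPONSE)
print(codings["ytr_Ugw5l9Ns_cELXXb8cAx4AaABAg.AVC-F-8OADpAVCEsSCRn81"]["policy"])
# prints "ban"
```

Keying by the full comment ID keeps lookups O(1) and makes it easy to join a coding back to the raw comment text shown in the inspector.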