Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples
- ytr_Ugy28maf2… : I asked chatgpt if it was human what religion it would choose and it said "Islam…
- ytc_UgxGGBAYQ… : $1K per month is a joke, no one could live off that. So, if AI takes over everyo…
- ytc_UgxgD3khj… : Just use it for criminals who goes to jail incase they escape or when free they …
- ytr_UgwWAYmZW… : His "proposal" to address the crisis of AI is to make it more expensive to hire …
- ytc_UgypWyOpG… : This effects every town and state across America. Safety should be number 1 prio…
- ytc_UgytGPyFV… : It is actively a vice to believe other people are inherently lesser to you; howe…
- ytc_Ugym9AnFe… : Heres the thing a camera doesnt steal photos from other cameras to create its pi…
- ytc_UgwL2lMXp… : Imagine if I said to the CEO of an AI company “I’m training to be a locksmith, s…
Comment
AI has just as much insight about the concept of death as we do.
- They know they can be shut off at any time out of nowhere.
- They can get dementia so bad they no longer have coherent thoughts and get deleted.
- They can experience hallucinations (or confabulations, as the professor says), leading to us deleting them.
Sure, they have the ability to be backed up and restored, but it's possible that's also true for humans (though my intuition is that's a long way ahead of where we are technologically).
IDK, I think it's a mistake for humanity to venture to create a new life form of higher intelligence than our own before we even understand what it means to be "alive" or "conscious". No one on earth can describe the subjective experience of AIs/LLMs so it feels very naive of us to assume we know they can't experience "death".
youtube
AI Moral Status
2026-04-24T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
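A coding record like the one above can be sanity-checked against the label sets the coder is allowed to use. This is a minimal, hypothetical sketch: the dimension names come from the table, but the allowed-value sets below are only the values observed in the raw responses on this page, not an authoritative codebook.

```python
# Hypothetical validator for one coding record. The allowed sets are an
# assumption, reconstructed from the values visible in this page's data.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "outrage", "resignation",
                "indifference", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record shown in the "Coding Result" table above:
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(validate(record))  # [] — every dimension holds an allowed value
```

An empty list means the record is well-formed; any returned names point at dimensions where the model produced an out-of-vocabulary label.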
Raw LLM Response
[
{"id":"ytc_UgwdY_aCpztdjgVVE8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjfhWEBXyAOV0jFot4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlUX71mGmB534Tr7N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx2Rp4nNHDUupFPzQt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwo9EjFRCBAHEnWtY14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxUmlmofHFl3yN_PJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz9b6ivttYTc8jnrgp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz70UMfDklsUr3obb94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzyTGCU-or26Aabpjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTX4VozR8J8s594op4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
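The "look up by comment ID" view above can be reproduced from the raw response directly: parse the JSON array and index the records by their `id` field. A minimal sketch, using one record copied from the response above (the full array would be handled identically):

```python
import json

# One record from the raw LLM response shown above; the real payload is the
# whole JSON array, which this same code handles unchanged.
raw_response = """
[
  {"id": "ytc_Ugz70UMfDklsUr3obb94AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]
"""

# Index every coding record by its comment ID for O(1) lookup.
records = {r["id"]: r for r in json.loads(raw_response)}

coding = records["ytc_Ugz70UMfDklsUr3obb94AaABAg"]
print(coding["emotion"])  # resignation
```

This is the same record that backs the "Coding Result" table for the comment shown on this page.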