Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The best solution is to SHOTGUN ALL surveillance cameras nationwide. I would al…" (`ytc_UgyesAHIy…`)
- "Nah fr bro, AI art actually sucks, can’t even recognize what you look like 💀…" (`ytc_UgykQJEQp…`)
- "Imagine the look of the first guy who buys an AI women and she leaves him lol…" (`ytc_Ugwm1nCRi…`)
- "Nothing is hack proof- not even a autonomous truck. Highlights this comment and …" (`ytc_Ugy4VMaSZ…`)
- "One thing that I keep wondering about--if everyone is unemployed, who is going t…" (`ytc_Ugy9qxcfF…`)
- "What's increasingly clear is that America's cyber security defense is no match f…" (`rdc_ld5mkg7`)
- "AI narration is also prevalent. Soon if not already, politicians will be using …" (`ytc_Ugz8XQtsj…`)
- "When your doctor tells u i need an ultrasound to diagnose your appendicitis tell…" (`ytr_UgwBMWRf9…`)
Comment

> I'd say yes, if an AI can basically think like a person then it becomes a person to me then it deserves at least the right to know why we do the things we do to it and whether it wants to do it or not, if they can't "feel" like we do then at least give them the right to avoid as many risks as they can. The right of self preservation should be universal no matter if the being is code or blood.

Source: youtube ("AI Moral Status", 2019-06-20T17:4…)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyOJfYxKPGjB1Ix7KN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVNtk5A7TC4hfNsjF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwoZKwVbJjpNDmfwNV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwu2PuzXEwwlro1exp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxHc4Um5b97zb_H4hZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwI2KA_rxa7MaIy4eR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzp0W5FNyMcz-M74Gp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzM0zeb2XqYwQZ661N4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRq0G8ckN4ASLxiNB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzoLwzNi9nE5fJ_-u94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
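A batch like the one above can be sanity-checked before its rows are written to the coding database. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the codes visible in this dump (the actual codebook may define more categories), and the function names are illustrative, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from this dump; the real
# codebook may include additional categories (assumption).
SCHEMA = {
    "responsibility": {"developer", "user", "government", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of problems found (empty = clean)."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    if not isinstance(rows, list):
        return ["top-level value is not a JSON array"]
    problems = []
    for i, row in enumerate(rows):
        if "id" not in row:
            problems.append(f"row {i}: missing 'id'")
        for dim, allowed in SCHEMA.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"row {i}: {dim}={value!r} not in codebook")
    return problems

# A well-formed row passes; a malformed one is flagged.
good = '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}]'
assert validate_batch(good) == []
assert validate_batch('[{"responsibility":"alien"}]')  # missing id, unknown values
```

Because LLM output occasionally drifts (extra keys, misspelled categories, truncated JSON), running every batch through a check like this before storage makes the "Coded at" records above trustworthy.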