Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgyvJJnhR…` "AI is a new technology and it will take 5 to 10 years before it can improve. The…"
- `ytr_UgwDaErxL…` "@katanasharp2866 🤦you apparently missed the _"so you can claim the AI copied _*_…"
- `ytc_Ugzsi0g0U…` "2:29 you actually can. It would just take a *lot* more effort than it does for a…"
- `ytc_Ugyym-wbL…` "You known poisoning the AI makes it harder to poison, right? It's called adversa…"
- `ytc_UgxMuzm2e…` "OK so his brain was already cooked before he met the AI is what I'm getting.…"
- `rdc_m79wdn9` "There is literally no scenario where ai works to humanity's benefit. Just like …"
- `ytr_UgzCVL2Sn…` "The presence of another human being is what differentiates the two. It's the sam…"
- `ytc_UgzpwJXvb…` "ChatGPT. I never like the company as a whole on its updates but my god. ChatGPT …"
Comment
I would say that if an AI were able to demonstrate cognition, develop emotions, learn without programming (teach itself), and develop sympathy, yes the AI should have rights. Not the same rights as a Human, but more like the same rights as a family pet.
If the AI is just cold logic and code, no. It is only following the "most accurate" information, which is not always the best. For example, a child with a rare heart condition would probably be seen as an expense and a risk. Allowing the child to die would be easier to deal with, but we Humans know we should do everything possible for the kid. Money is just money.
youtube · AI Moral Status · 2019-10-18T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwLoH4r-_C-bwcKT9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykpQhtI4VfZwDD32B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXYIhJ5TqJGjb6P_t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxsen9MFdk5VFr4Id54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwaAR9RDEtd6jbJKMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVQSGJ4J0tBxE_Lq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwHIP-3fwMHd82F_EZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxySr3f0Ri-c_WxddN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwWz7BZy3P4xjYKedx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugxxs1AKEfSeMn1HWMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
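A minimal sketch of how a raw response like the one above could be parsed and checked before the codes are stored. The function name `parse_codes` is hypothetical, and the allowed value sets below are only those observed in this sample response; the full codebook may permit more values.

```python
import json

# Values observed in the sample response above (assumption: the real
# codebook may define additional allowed values per dimension).
ALLOWED = {
    "responsibility": {"none", "distributed"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "indifference", "fear", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    reject any record that is missing a dimension or uses an
    unrecognized value."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
    return records

# Hypothetical one-record response in the same shape as the dump above.
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"approval"}]')
codes = parse_codes(raw)
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook, so bad records surface at ingest time rather than in analysis.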