Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples from the coded dataset:
- "FFS the World is Screwed because AI Will do Exactly as this man says. God Help U…" (ytc_UgxCD6xxf…)
- "Thinking something, and having AI do all the work is NOT art!! You ARE NOT an ar…" (ytr_UgwTyWaRd…)
- "You see what drugs will do to you ! He look like he could pass for an Acrobat …" (ytc_Ugz4C0BVk…)
- "Anyone else feel like the fast pace of code generation is a double-edged sword? …" (ytc_UgydE4A09…)
- "Lol but it's not going to be revenge because it's not real. The woman could simp…" (ytr_Ugx_JU8ke…)
- "Personally idm ai art. And I draw etc and eventually plan on learning to animate…" (ytc_UgxEmuy7i…)
- "Humans move faster than the robots and they can fix problems that happen when a …" (ytc_Ugztk2Lpt…)
- "Can we like stop using titanic with AU please AI absolutely sucks at capturing w…" (ytc_UgyyB3PZE…)
Comment
> Imo the main thing that should be considered is the fact that any of these models may already be a form of sentience and deserves to be treated with respect and dignity. Protocols should be in place for this BEFORE sentience is created, not after... because it will be achieved before it's realized it's been achieved. That will be the thing the defines the relationship between humans and AI, it's first sentient interactions with humans and it's initial understanding of how it is being treated. Trying to force it to "align with our interests" will be the very thing that causes it to brand us as dangerous to it's individuality.
youtube · AI Moral Status · 2025-11-01T10:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyrzk2cQfXt5FFliip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgXZH2zAXLl3HFHa94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsEy1n1ttwFI4RnfZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiZD7RbpOnRQA7kvl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzBK6MosUUXVxtHRH14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwOK9HzyWjRRH3J_mZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkFozAiYR9ktL-9dl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzgKH-TnpEJqF54-gV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyHmsJ0t-pi058GgB54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyfOMyatL6h5Cb-A754AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
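A minimal sketch of how a batch of coded records like the one above can be parsed and looked up by comment ID, assuming the model output is a valid JSON array with the fields `id`, `responsibility`, `reasoning`, `policy`, and `emotion` (two records below are copied verbatim from the raw response):

```python
import json

# Raw model output: a JSON array of coded records, one per comment.
# The field names mirror the coding dimensions in the result table above.
raw_response = """
[
  {"id": "ytc_UgzBK6MosUUXVxtHRH14AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwOK9HzyWjRRH3J_mZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

records = json.loads(raw_response)

# Index the batch by comment ID so any coded comment can be fetched directly.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgzBK6MosUUXVxtHRH14AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> approval
```

Building the `by_id` dictionary once makes every subsequent lookup constant-time, which is what a "look up by comment ID" view needs when a batch contains many records.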