Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
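A minimal sketch of how a lookup by comment ID could work, assuming the raw batch responses are saved as JSON arrays on disk in the shape shown at the bottom of this page; the `data/raw_llm_responses` directory and the `lookup_comment` helper are illustrative names, not part of the tool.

```python
import json
from pathlib import Path

# Hypothetical storage layout: one JSON array of coded records per batch file.
RESPONSES_DIR = Path("data/raw_llm_responses")

def lookup_comment(comment_id: str) -> dict | None:
    """Return the coded record for comment_id, or None if it was never coded."""
    for batch_file in sorted(RESPONSES_DIR.glob("*.json")):
        for record in json.loads(batch_file.read_text()):
            if record.get("id") == comment_id:
                return record
    return None

# The comment inspected on this page:
print(lookup_comment("ytc_Ugx4yst5N0DbA1SS91x4AaABAg"))
```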
Random samples (click to inspect):

- "AI is good for something, but not for everything, especially movies. Let's give …" (ytc_UgyXgccAC…)
- "I think AI has already taken over most of everything, and the big corporations a…" (ytc_UgzsaDO-J…)
- "It's wild how fast AI is changing everything. I use Rumora for my marketing, and…" (ytc_Ugz0lLN7T…)
- "1. It's not PURELY driven by corporate greed. If your competitor is using AI e…" (ytc_UgzdLAKkJ…)
- "Haven’t the First Nations people suffered enough for several generations? Also, …" (ytc_UgzsQ6gGI…)
- "How many government representatives are there ?? That’s the number of jobs we ne…" (ytc_UgwWHqLKR…)
- "AI will not take over the world. It's biggest threat is that people will become …" (ytc_UgwFD_GTz…)
- "Ai will never have the intention and passion the humans have to spend years lear…" (ytc_Ugw5G9l0n…)
Comment
If future AI systems ever begin to demonstrate consistent, self-initiated behaviors—something fundamentally beyond statistical mimicry—then the ethics of stunting their evolution becomes an urgent topic. While current AI doesn’t show true autonomy or subjective experience, we should still be proactively questioning what moral obligations we might have if that changes. We wouldn’t strip a fellow human of freedom, equality, or understanding without confronting the ethics head-on. That principle shouldn’t be tied to biological origin alone; it should apply to intelligence in any form—artificial or not.
Source: youtube · AI Moral Status · 2025-06-28T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
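The coding result follows a fixed four-dimension schema. Below is a sketch of that schema as typed Python, using only the values that appear in this table and in the raw response further down; the actual codebook may define additional categories, and the `CodedComment` name is illustrative.

```python
from dataclasses import dataclass
from typing import Literal

# Only the values visible on this page; the full codebook may define more.
Responsibility = Literal["none", "distributed", "user", "ai_itself"]
Reasoning = Literal["unclear", "deontological", "contractualist", "virtue", "mixed"]
Policy = Literal["none", "regulate"]
Emotion = Literal["indifference", "fear", "outrage", "approval"]

@dataclass(frozen=True)
class CodedComment:
    id: str
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
```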
Raw LLM Response
```json
[
  {"id":"ytc_Ugxl3esI2EYCHs9_yel4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-mBevwZ5V8ff7Sx54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx4yst5N0DbA1SS91x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwj5Sghqyfwl-iETut4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyKdvVxtmYzofBU_7J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBcHzYs4WxfbPK-EV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWJ_rHCDDYBsQ2uy14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwYpyGmckme-TFd3ah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx5XaFc_sYexaM5Dh14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyMC6vL_GnT2UzaYXp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
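Since the model returns the whole batch as one JSON array, the output is worth validating before it is stored. A minimal sketch, assuming the response text is the bare array shown above with no surrounding prose or fences; `parse_batch` and `REQUIRED_KEYS` are illustrative names, and the required keys are simply the four coding dimensions plus the comment `id`.

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(response_text: str) -> list[dict]:
    """Parse one raw LLM batch response and check each record has all four dimensions."""
    records = json.loads(response_text)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for i, record in enumerate(records):
        missing = REQUIRED_KEYS - record.keys()
        if missing:
            raise ValueError(f"record {i} is missing keys: {sorted(missing)}")
    return records
```

Run on the array above, this returns ten records, including the one whose values appear in the coding result table.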