Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "So I don't see many people discussing the concept of 'purpose' regarding AI. Thi…" (ytc_Ugzasdi7k…)
- "There are so many concerning things about this. First of all, giving a robot a g…" (ytc_UgzqkX4NB…)
- "The other thing that is missing from these discussions is, what is the actual fu…" (ytc_UgxnUiSJN…)
- "Replacing senior leaders is actually something that AI is well suited to do. Wa…" (rdc_m2a2k6u)
- "Though I hate how toxic some people in the AI community can be, I'm still on the…" (ytc_Ugw6mJ3tj…)
- "Bro calling someone an “ai artist” when the most amount of work they actually do…" (ytc_UgzyqF6CX…)
- "That's insane for the first time in his life Elon Musk doesn't have an answer to…" (ytc_UgzlEPGB8…)
- "@andrewgrantcomedy I intended to refer to the companies developing AI, not the…" (ytr_UgynrwkZi…)
Comment
What do you think is missing in the current AI ethics discussion?
For me, it’s the deep philosophical and existential dimension — the protection of human essence, individuation, and authentic self-overcoming in an age of algorithmic optimization.
Drop your thoughts below. I read every single comment.
And if you’re looking to discover your own Archetype and how to protect your Human Essence in the AI era
→ take the free Career Archetype Test https://www.thesmallreset.org/career-archetype-test/
Your depth is worth protecting.
Source: youtube · Video: AI Moral Status · 2026-04-23T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyapTP4_hx4QcrYjd54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGsb_77uPcxYStb1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOmT2KHic54-oGmjd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGzaFeAI9tAwCKFr54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwDfaFGCOu069Edaa94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGOOI7YvKnR7ZZ0o54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxSzRHlF0fKDiiytyt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMv9wsTXn-rjPApiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzewOqwiGgp2Sp3P1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6lEMQc-E2j5OVFwN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
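The raw response is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step might parse and index it like this; the function name, the validation rules, and the two-record sample payload are illustrative assumptions, not the dashboard's actual code:

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codes,
# shaped like the output shown above (trimmed to two records).
raw = """
[
  {"id": "ytc_UgyapTP4_hx4QcrYjd54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwDfaFGCOu069Edaa94AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
"""

# The four coding dimensions used in the table and JSON above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_response: str) -> dict[str, dict[str, str]]:
    """Parse the model output and index coded records by comment ID.

    Skips any record missing an ID or one of the four dimensions,
    so malformed model output degrades to fewer records rather
    than a crash downstream.
    """
    by_id: dict[str, dict[str, str]] = {}
    for record in json.loads(raw_response):
        if "id" not in record:
            continue
        if not all(dim in record for dim in DIMENSIONS):
            continue
        by_id[record["id"]] = record
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgwDfaFGCOu069Edaa94AaABAg"]["policy"])  # unclear
```

Indexing by ID makes both dashboard views cheap: the random-sample list only needs the IDs, and a lookup resolves any ID to its full coded record in constant time.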