Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The idea of a future where AI makes life better for everyone is exactly why im s…" (ytc_Ugy_psmlR…)
- "I feel like a war of ai art vs our own art is gonna happen (if it is ima join it…" (ytc_UgyGcZYZl…)
- "Definitely looking forward to AI and machine Learning replacing doctors as much …" (ytc_Ugwcr6l59…)
- "Wow, I’m early lol Not surprised deepfake stuff with AI is getting out of contro…" (ytc_Ugwa9wq2M…)
- "This is why AI should not be something we depend on. AI is the bane of our era.…" (ytc_UgztbRcoL…)
- "The seeds of humanity's destruction are quite likely in the primal and violent i…" (ytc_UgwI_MNLK…)
- "As an amateur digital artist (self taught since 2018), I was extremely against t…" (ytc_UgxIYUZ9z…)
- "I feel like this will make CEOs think that people want ai But god it is too fun…" (ytc_Ugx9bTVPT…)
Comment

> As a disabled person I do NOT want AI doing all that in healthcare and it's disturbing to me that people want it to... it magnifies the biases of what it's trained on and he acknowledged that doctors mis-diagnose and that people get discharged too early, so we're going to feed all those averages to an unfeeling machine and let it make the wrong decisions with peoples lives for tens of thousands or hundreds of thousands until it starts to learn?? It's not a game of go, it can't run the choices a thousand times per second or whatever, each case that would be used is a person's life. And a life that it may or may not get trained correctly on because if the person gets discharged and DIES it may not be something that ends up recorded in that hospital. At least if a human being makes a catastrophic mistake there is a real person that can be held culpable

Source: youtube | AI Moral Status | 2026-03-01T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxZwfTj6L93bw1nAqR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwqL6Ev_v4_JZAN09Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwc_F6-DJ4D_zIqE4V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzNqtDsSNEQv-ivtQh4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyIXRwPKVEu9-9Aga94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgywNPK80_o3ExG9xwR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxHf3pYQXWOCeiouW94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwak4oN3WuFJIxTEJx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzSB51RlpaVQX3gfCJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxJyBVTXsR5ykqmkr54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
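The raw response is a JSON array with one object per comment, each coded on the four dimensions shown in the table above. A minimal Python sketch of how such a response could be parsed and sanity-checked is below; the allowed value sets are inferred from the values visible on this page, not from an official codebook, so treat them as assumptions.

```python
import json

# Two records copied from the raw response above, truncated for brevity.
raw = """
[
  {"id":"ytc_UgxZwfTj6L93bw1nAqR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwqL6Ev_v4_JZAN09Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
"""

# Assumed per-dimension vocabularies, collected from the examples on this
# page; the real coding scheme may define additional categories.
VOCAB = {
    "responsibility": {"none", "ai_itself", "government", "distributed", "developer", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "industry_self", "none"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def validate(records):
    """Return (comment_id, dimension, bad_value) tuples for every field
    whose value falls outside the assumed vocabulary."""
    errors = []
    for rec in records:
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

records = json.loads(raw)
print(len(records), validate(records))
```

Under these assumed vocabularies both sample records validate cleanly; a malformed or hallucinated label from the model would surface as a tuple in the error list rather than silently entering the coded dataset.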