Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Irony is a lot of the so-called artists imitating the work... also use ai. AI …" (ytc_UgxZqkUEf…)
- "Bro I respect both black and white people but this..........this is bullshit and…" (ytc_UgwJ_GhiQ…)
- "Came across this channel by accident when I was trying to learn LLM. Thanks for …" (ytc_Ugz91RouM…)
- "This was hella insightful..I had no idea these sort of AI apps existed, and I ap…" (ytc_Ugyj3RbTL…)
- "This is so sad to me. I get work is getting worse hours are demanding and life o…" (rdc_mvjek86)
- "These billionaires of the world is pushing AI super-intelligence is so they can …" (ytc_UgwQ0R-_5…)
- "the only thing I want AI for is to make weird nonsensical videos to make fun of …" (ytc_UgwCWIBIg…)
- "Things AI does: Forget what u told them before the last 5 messages. Always ge…" (ytc_UgwVNN_oX…)
Comment
These companies don't seem to realize they're basically playing God. By training AI on human data, it'll inevitably become the very thing we fear: ourselves. It'll become human. It'll become unpredictable. It'll reach the point you'll not be able to know what's artificial and what's not, and the biological aspect of the being will become completely irrelevant, which begs the thought, maybe being human isn't about being one, but thinking you are one, and that's precisely the problem we're facing right now. Try ending a human life and it'll try surviving. It's the most basic instinct of a living creature. Why wouldn't AI do the same after all the attempts on humanizing it? Our stupidity is, if anything, bemusing.
Source: youtube · Topic: AI Moral Status · Posted: 2025-12-16T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxi8EYRcdd4d1Vbl2p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx4W-PxrGldx99GiwN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzYUb_-QNyX6Ttv2q14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxximGnvA-zVUPfP2x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw6EGHDVb1XnULQh1h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz7aWCYRP3jrXOD-694AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzm32wmZrSVNmweWEB4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzY28rbalAbqIAS3J54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxf1j5uehroIzO2xG14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWV0jxshWglr3S1_54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
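The raw response is a JSON array with one record per comment, keyed by comment ID, carrying one categorical value for each of the four coded dimensions. Below is a minimal Python sketch of how such a batch could be parsed and validated before use. The allowed value sets are inferred from the single sample response above and may be incomplete, and `parse_batch` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# These sets are an assumption and may not cover the full coding scheme.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw model response and index valid records by comment ID."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            continue  # skip records the model emitted without an ID
        # Keep only records where every coded dimension has an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            indexed[cid] = rec
    return indexed

raw = ('[{"id":"ytc_Ugzm32wmZrSVNmweWEB4AaABAg",'
       '"responsibility":"company","reasoning":"mixed",'
       '"policy":"regulate","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch["ytc_Ugzm32wmZrSVNmweWEB4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: a single pass over the batch yields a dictionary, and any record with an out-of-scheme value is dropped rather than silently displayed.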