Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I’m not sure that’s true. Sure the AI companies internally are ahead but that’s …" (rdc_o80oeh3)
- "and yet, none of them looked better than the ai
  that's the world we're in…" (ytc_UgxOKfxfE…)
- "Consciousness precedes biological life, ai can’t become conscious but a pre exis…" (ytc_UgxiUBIJQ…)
- "dont be fooled. AI isnt a being. Its a TOOL created by HUMAN. What ever AI will …" (ytr_UgxvCUVhT…)
- "Not exactly sure what you mean. But can you name a person alive right now who is…" (ytr_UgwqkUO2p…)
- "Very important statement.
  This will certainly be regulated soon, as other claims…" (ytc_UgyvzEHe0…)
- "LLM-based AI is mostly just a mimic. Garbage in, garbage out. It has been fed al…" (ytc_Ugwbux0NN…)
- "This is super interesting but I would be curious to know how the “automated bots…" (ytc_Ugy3YIGyD…)
Comment
AI is shit. It's a shitty, buggy product that steals from people so that it can do a poor impersonation of things humans do much better. Its data centers damage the environment and the livelihood of those in its vicinity. People buy into its quasi-mystical aura while it tells them they're wonderful and smart. It's ridiculous that so many people have jumped on board with it, but that's where we are as a society: we are fucking moronic. We fought in supermarket aisles over toilet paper during Covid, because we feared a made-up toilet paper shortage... ironically causing such a shortage ourselves. We are fucking stupid.
youtube · AI Moral Status · 2026-04-04T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgxChZcT5DypxTuTP0l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwomT-yqSjhoAfl6Md4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwy1xW9XIFnd8KhDr14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_y7MSkbeAlFM12G54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyAKFF0Naipasxxpxp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyarwZHn0HTMsn3Alp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwet255iJTwiw0bItZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwfX6p3X1avCHEMT7h4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8PTiJuxDHprxHIV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzH3PK1lAuA2nueQyR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}]