Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- rdc_ectxdpx: Can someone please explain to me the argument against tech companies having our …
- ytc_UgwqpLAaX…: 15:27 "do you think peace is possible?" robot shakes her head no, but answers …
- ytc_Ugw3gQK20…: Problem isn't AI. It's humans accepting the generated "facts" sitting alongsid…
- ytc_Ugyn8FtjU…: Westerners might dislike this, rightfully so. But what if this technology helpin…
- rdc_nsg6kr2: I don't think there's any hopium to be had. I see a lot of newer generation kids…
- ytc_Ugw6xvGPq…: My school taught zero life skills. Fortunately my parents supplemented my learni…
- ytr_Ugx_dw9GX…: Clean and well argued. The evolutionary pathway matters consciousness didn't app…
- ytr_Ugyw31qVO…: Who will buy all the junk companies are selling, when no one has any income, bec…
Comment
I think this planet just doesn’t need humans anymore. We are so dumb to create a technology that can do everything better than we can. Add in greed and profits as the main driver and for a while, we will have mass poverty, with a few elites, controlling all resources, and enforcing their will with AI and robots, but eventually the whole system will collapse. No wonder aliens don’t come here. There is no intelligent life on earth.
youtube · AI Jobs · 2025-06-02T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxd80g821VtGk9rOiF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypxXTocoXWRy8RbA54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7JAKOUUhxWRG9Am54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwXS50Ybz9qNBfI0Yx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZ3dC8xVAWiyI8hLF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyQsZ8m3ubEScrrJ_p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxyc06jIwMNWT6ITVd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzc2RVO0f9dYtH8szB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8HCXU7Ucr36oHb_h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzA517xDVMGGiZv4uN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
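The raw response is a JSON array with one coding object per comment, using the fields shown in the table (id, responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID (the `index_by_id` helper is illustrative, not part of the tool; the sample array below reuses two entries from the response above):

```python
import json

# Example raw model output in the format shown above: a JSON array,
# one coding object per comment. Two entries copied from the response.
raw_response = """
[
 {"id": "ytc_UgyQsZ8m3ubEScrrJ_p4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
 {"id": "ytc_Ugw7JAKOUUhxWRG9Am54AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and index the coding objects by comment ID."""
    codes = json.loads(response_text)
    return {entry["id"]: entry for entry in codes}

codes_by_id = index_by_id(raw_response)
print(codes_by_id["ytc_UgyQsZ8m3ubEScrrJ_p4AaABAg"]["emotion"])  # → outrage
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: one parse per batch, then constant-time retrieval per comment.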