Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Yup, all of this will happen. Biological humans will be confined to Earth, or at… (ytc_UgyRXRpG2…)
- Please could you explain me the problem. I still don't think deep fakes are the … (ytr_UgyD4yhRh…)
- bro wasted like 400 gallons of drinkable water and 600 gigawatts of energy just … (rdc_n0gouz4)
- 💻 Hey, it's me AI, this crazy old fool has no idea how quickly I am going to wip… (ytc_UgzJyuT4L…)
- Boss: "We need you to train an AI that looks, talks and acts like you." Two week… (ytc_UgxZ2yLNt…)
- He is right in a sense that we need to regulate before a problem has already hap… (ytc_UgwZntqPG…)
- These guys 2 years ago had no idea what's coming whatsoever and no one knew the… (ytc_UgxOXQ3yz…)
- Thank you for sharing your thoughts! It's true that many people value human conn… (ytr_UgwQdkr2f…)
Comment
I am scared of AI for many reasons.
I am scared of it because it will take most jobs, and reduce me to poverty.
I am scared of it because it is not human, and therefore is incapable of creativity and empathy, and yet is already being used in fields where those things used to be essential.
I am scared of it because, well, the powers that be seem to love it so damned much. Nothing that a government loves so much that it is willing to put it ahead of human voters, as in Texas where the AI centre is leaching power from the local community, can possibly be a good idea.
I hate AI. We all should hate AI. We must unite, we must fight against.
DEATH TO THE CLANKER SCUM!
youtube · AI Jobs · 2025-09-06T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwxmpC-ihNhwhzUUV54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugxc6Q_mLUZPwgYRqLh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy8LeCS6AJHkwBGuTp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwur366SjOoHeXqmkV4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzTuSWDMb7Ahy8i2HR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "approval"},
  {"id": "ytc_UgxOO4Jws2HDu8Nuvpl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgygBX7HUNTHvIOHHLV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxx2AjTLKsmIqkrN_B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgzskWmiqDdj6yxy38l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugwt9Zvt5m7p3e1nzpR4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
```
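The raw response is a JSON array with one object per coded comment, so looking up a single comment's coding by ID is a matter of parsing and indexing. A minimal sketch, using two entries copied from the batch above (the variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coded comments (excerpt of the batch above).
raw = '''[
  {"id": "ytc_Ugy8LeCS6AJHkwBGuTp4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwxmpC-ihNhwhzUUV54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

# Index the batch by comment ID so one comment's coding can be looked up directly.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_Ugy8LeCS6AJHkwBGuTp4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

The same indexing works for any batch in this format; a missing ID raises `KeyError`, which is how an uncoded comment would surface.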