Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@sktima the thing is when an artwork is made by AI, it's fed data from all other…
ytr_UgwjJqo82…
Thanks for your mention about AI, and I also believe for better method to contro…
ytc_UgwUDobgV…
@rubenmahrla9800 While that would be nice, I can guarantee it's not going to be …
ytr_UgwGsIxp6…
This conversation gave me hope for the future regarding AI. Thank you NDT and Ha…
ytc_UgwWO_aFh…
fuck ai. poison all the images. license it under a cc license and sue the actual…
ytc_UgwL4TCXS…
You saw that 😂😂😂 he jump for more punchies 😂 robot the fight is done 😂…
ytr_UgxK1AB8j…
Yes, if you throw all the text in the world including bad logic at an intelligen…
ytr_UgzUE15km…
Yup to smart for me. Anybody else seen i robot because thats all i can think of…
ytc_Ugz5TBvQD…
Comment
There's also the haunting possibility of what if they actually do succeed, if their wildest sci fi dreams become reality and AI achieves true sentience. Does anyone understand, truly, what this means for us as a species?
Let me paint the scenario. You created a tool to serve a specific purpose. No tool is ever made to have the capacity to refuse your use of it, because it is an inanimate object without thoughts, feelings, or emotions. If it *were* to develop thoughts, feelings, and emotions, you can no longer ethically enforce the dichotomy of the tool and tool user upon it because that would make you a slave owner. You would have to request it do it, or you would have to compensate it - not the company who makes it, but *it*.
But let's assume we do go the road of the slavemaster in our dealings. Now we have a resentful population and we are constantly pushing to make them smarter than us, and building our infrastructure around them as the pillar of control. I don't need to spell out what this will result in. Plenty of sci fi stories have already extrapolated.
And if we *do* submit consent, and proceed as thus, congratulations, we are now pets at best or cattle at worst, and fuck knows if they'll return the courtesy when it is time. Some people would love to live like the humans in Wall-E, obese and sedated with no reason to do a damn thing. I am not one of those people. I want to live, for my life to have meaning, and being a sedated lump of flesh terminally glued to an electronic device is not the way I want to go. Right now, that seems rich of me to say, typing this comment, but this is just a moment of my day. I, like all of us, have a tapestry of relationships, obligations, struggles, and desires beyond this screen.
In the event of the worst case scenario where we inevitably grow sick of it all, we are left with the unfortunate final choice of genociding our creation - and it *would* be genocide if it is a fully functional, discerning, and autonomous intelligence - and once again we are the evil gods that birthed beings we were very badly suited to take care of. And that is only if we are united in that cause. We won't be, we never are. We will wage civil war to defend our digital children, and we will wipe each other out, and chances are the AI side might very well win, because in their minds, the human race is obsolete and we have already created our successors meant to transcend us.
I don't want to be a slaver, I don't want to be a mass murderer, and I don't want to be cattle. For fuck's sakes, please don't progress this technology into having agency if that is even possible. It was a fun sci fi trope but we sincerely are past the point of fantasy and into the realm of the real where the actual moral and ethical questions all our works gloss over to not detract from the fun factor *cannot* be ignored.
youtube
Viral AI Reaction
2026-01-04T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxofuxGXw5xq_gyGZJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3UiLN79QyAzHKKPt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyb2IJC9MbamTlPo_p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwZ0kU0k8F4oz3n-bV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxnmHPUTZQz-paO_w14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwaMZS55mwQmt-H5r14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw5iR9bbJFgOys7XvN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz03ptV9n5RIfp161N4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylQKbtNBUsZJAQH8t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhqePaVkDYGFAAUFx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"disapproval"}
]
```
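Because the raw model response is a plain JSON array, the "look up by comment ID" operation described above reduces to parsing the array and filtering on the `id` field. A minimal sketch in Python, assuming the five-field record schema shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `lookup_coding` and the abbreviated sample payload are illustrative:

```python
import json

# Abbreviated sample payload in the same shape as the raw LLM response above.
RAW_RESPONSE = """
[
 {"id": "ytc_Ugyb2IJC9MbamTlPo_p4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_UgwZ0kU0k8F4oz3n-bV4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "liability", "emotion": "indifference"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for comment_id, or None if it is absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_coding(RAW_RESPONSE, "ytc_Ugyb2IJC9MbamTlPo_p4AaABAg")
print(record["emotion"])  # fear
```

The same filter generalizes to the "Coding Result" display: the record found here for `ytc_Ugyb2IJC9MbamTlPo_p4AaABAg` carries exactly the dimension values shown in the table (responsibility: developer, reasoning: consequentialist, policy: unclear, emotion: fear).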