Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgzypruKf…`: "Are you serious? AI IS an issue. This world survived without it. We don’t need …"
- `ytr_UgwtVDtHz…`: "@DaJodadeasy, we hack one that looks like Arnold Schwarzenegger when he was you…"
- `ytc_UgxX34vJK…`: "Never ever, the scientist have responsible to prevent the robots. That's it, we …"
- `ytr_UgwjQLfDb…`: "We appreciate your observation. In our live broadcasts on AITube, viewers can en…"
- `ytc_Ugy40WZiz…`: "When the topic of licensing AI was discussed and the professor mentioned AutoGPT…"
- `ytc_Ugzrwdumx…`: "Racist facial recognition program? Kind of makes you wonder about the kind of pe…"
- `ytc_UgwE4dDuk…`: "Story time. My highschool was hit with this, photoshopped and/or leaked images. …"
- `ytc_Ugw01z_mx…`: "There needs to be a better solution to life for humans. If AI and robots can do …"
Comment
The problem is that this technology is completely open source, and within a generation people will freely be able to create an entire program like this using AI itself.
Open source software itself may possibly be regulated, but it will NEVER be stopped. I mean, it's like people sharing ripped MP3 songs and pirate movies around the early 2000s.
This needs to be treated like pornography, and the people with content such as this on their computer need to be treated like people who create indecent images of children.
The problem here is that the utility of undressing a person is that it is used to extort or openly humiliate people in real life, and is not necessarily something created to be "used"(...) in secret like child pornography. In fact, at first glance, if someone sees a naked picture of an attractive woman online, it would likely get shared around and even liked, because NO ONE may even KNOW that it is illegal ... That would certainly not happen with child pornography ... If someone shared child pornography online, many people would call the police directly because it is so easy to identify. This is not the case with AI "undressed" photos: viewers would at least have to "know" the person to know it was nonconsensual, and even then they may not know for sure ...
There is simply NO way to regulate this using the law while protecting the rights of people to use a computer privately. Everyone is at risk if they so much as post their face online ...
youtube
AI Harm Incident
2023-11-25T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgwIltveQweDzFIvbTh4AaABAg.9xX8p3TbQ6C9xXWPTejk7d","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwIltveQweDzFIvbTh4AaABAg.9xX8p3TbQ6C9xYmjnBt2ah","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyY8h8xh5ARxYANpXt4AaABAg.9xX3XMCzftZ9xX6mX1hO1s","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxNjtAT49cbHdREgpN4AaABAg.9xX1Lbat3w39xX2sowYzla","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxNjtAT49cbHdREgpN4AaABAg.9xX1Lbat3w39xX78Fu36Wd","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytr_UgxNjtAT49cbHdREgpN4AaABAg.9xX1Lbat3w39xX7iC9Od6l","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytr_UgyxhnJnj-EDiBuwUL54AaABAg.9xWoIx9FO309xWqpySuApl","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgyxhnJnj-EDiBuwUL54AaABAg.9xWoIx9FO309xWxlYLsRJS","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxQQ0Cas_DEsIi-_bJ4AaABAg.9xWntWuh9NR9xWtZB0tPN4","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxQQ0Cas_DEsIi-_bJ4AaABAg.9xWntWuh9NR9xWzyixaHvh","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
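A batch response like the one above can be parsed and validated before indexing by comment ID. The sketch below is a minimal, hypothetical helper: the `ALLOWED` value sets are inferred only from the codes visible on this page (the actual codebook may define more categories), and `parse_llm_response` is not part of any real tool shown here.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the values
# visible in this response and the Coding Result table; the real codebook
# may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference",
                "approval", "mixed", "unclear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid rows by comment ID.

    Rows that are not objects, lack an "id", or carry a value outside the
    allowed set for any dimension are silently dropped.
    """
    indexed = {}
    for row in json.loads(raw):
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            indexed[row["id"]] = row
    return indexed
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each lookup is a single dictionary access rather than a scan over the raw response.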