Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgxnJrWrG… — "That's straight up theft. One would think these companies would be more responsi…"
- ytc_Ugx4qE7Zh… — "Come on AI 🤖🎉 if I couldn’t be successful then I don’t want anyone else to be su…"
- ytc_UgwIPqdn7… — "People using these remix / munging / mash up machines, really don't understand, …"
- rdc_kr8s3jr — "My contribution to helping train the AI includes Balderdash, Dagnabbit, Fartknoc…"
- ytc_UgyW43-tu… — "As an let's say \" ex artist \" , like I have legit multiple diplomas and went to …"
- ytc_UgzUPgoQw… — "It is in our nature to destroy ourselves! Making AI robots is a sure path to des…"
- ytc_UgwbQWzIF… — "Honestly, I think AI art is just plain scary as an artist, and when I first saw …"
- ytc_UgwnKUIUx… — "If u are going to use chatgpt to diagnose car issues or cooking or ur health.. m…"
Comment
While i understand and even agree to your concerns about AI and how it's used, as the examples you gave are certainly egregious, I feel that AI is simply a tool. A tool that can be devastating in the wrong hands of course but a tool none the less. Just as a hammer can be used to smash in someone's skull and murder them, it can also be used to build something amazing and beautiful. The problem isn't the AI itself but the people using them for bad reasons.
Note, that i am in no way condoning or defending any of the horrid things that people are using AI for that you bring up in this video. Those are shitty people. But shitty people are always going to do shitty things no matter what they have available to them.
Source: youtube · Video: Viral AI Reaction · Posted: 2024-10-23T04:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
    [
      {"id":"ytc_Ugzx6S7b8AfrxMJMFtt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
      {"id":"ytc_UgycVl9LKPeNDPwNgv14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
      {"id":"ytc_UgwuRDqE4m2UdAESsyZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
      {"id":"ytc_UgybBYenc8T5bNyhNfF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
      {"id":"ytc_UgwR1drR1MOa1axBfl94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
      {"id":"ytc_UgwPkwDTesA-R9avibN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
      {"id":"ytc_UgwYznR6O0KZP4xJK-B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
      {"id":"ytc_UgxVRDw-i2dwrfb30ZN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
      {"id":"ytc_UgwsFhRKYMOdx1QJvMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
      {"id":"ytc_UgwoUSgPk41T5JSGqot4AaABAg","responsibility":"user","reasoning":"unclear","policy":"industry_self","emotion":"indifference"}
    ]
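A batch response in this shape can be parsed and checked before it is stored. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above, but the allowed-value sets are only inferred from the codes visible on this page; the project's full codebook may define more.

```python
import json

# Allowed values inferred from the codes observed on this page
# (an assumption, not the tool's actual schema).
ALLOWED = {
    "responsibility": {"user", "company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so malformed model output fails loudly
    instead of silently entering the dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {value!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example: the record that produced the Coding Result table above.
raw = ('[{"id":"ytc_UgycVl9LKPeNDPwNgv14AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgycVl9LKPeNDPwNgv14AaABAg"]["policy"])  # → none
```

Keying the result by comment ID makes the "look up by comment ID" access pattern a plain dictionary read, and the strict value check catches the hallucinated or off-codebook labels that LLM coders occasionally emit.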