Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It is very sad to see how people are stealing content from all over the internet to fill up their dataset. I mean, yeah, it would have been impossible to train big models otherwise but the situation looks pretty mad.
The thing is that it is not illegal to make a dataset of, for example tweets/retweets/messages, and it is VERY hard to prove that your model used something/ was trained on this tweet/message/image/artist. All of this seems to me as big robbery and no government can do anything with it at the moment.
Sadly, it is a matter of time when this 'poison' will be just filtered out because it is still cheaper to process the whole dataset of images than paying everyone whose work is being used
Although I don't think that 90% of 'ai artists' are educated enough to design an architecture like this
Cruel world
| Source | Video | Posted |
|---|---|---|
| youtube | Viral AI Reaction | 2025-04-02T20:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxieyF0tLlmLcgFQWB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxw3wCYQ0peYxQN5fF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugwdmc-I_IRjHykrHRl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgydPQHvNEKttbXJ8ot4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxiUx6fyiR1M-EDGwN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyXpzuGnYxQnDdUClB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzfPyDtTIhs2wMujyB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
  {"id":"ytc_Ugz5nLtZmlSdis9Yvsh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxBfCCQJrMDdTpe75l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwWGXlXMfXY9CNujOd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"approval"}
]
```
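The raw response is a JSON array of per-comment records, one object per coded comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be checked before display, assuming the label sets are limited to the values visible in this dump (the actual codebook may define more categories; the `ALLOWED` sets and the `validate` helper here are illustrative, not the tool's real schema):

```python
import json

# One record copied from the raw response above, used as sample input.
RAW = ('[{"id":"ytc_UgzfPyDtTIhs2wMujyB4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"sadness"}]')

# Assumed label sets, taken only from values seen in this dump.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "disapproval", "outrage", "resignation",
                "fear", "sadness", "approval", "unclear"},
}

def validate(records):
    """Return (id, dimension, value) triples for any out-of-codebook label."""
    bad = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                bad.append((rec.get("id"), dim, rec.get(dim)))
    return bad

print(validate(json.loads(RAW)))  # an empty list means every label is known
```

A check like this catches the common failure mode of LLM coding runs: a response that parses as JSON but uses a label outside the codebook, which would otherwise surface only as a blank cell in the result table.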