Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Almost all mentally and intellectually working employees are burnt out and done …" (ytc_UgxIs4pG6…)
- "This is unstoppable. Driverless trucks in the near future might create a demand …" (ytc_UgwanV6mp…)
- "Ya know, it's sad none of these "scholars" understand the true nature of AI and…" (ytc_UgxTsiiW_…)
- "Does this mean we can now pass laws that only apply to people born before 1970? …" (rdc_oi2za8l)
- "Its impossible to get rid of AI art now, Even if its trained with open royalty f…" (ytc_UgyjYuCJt…)
- "It is exactly the same thing as Windows 11 shifting to an "AI OS." I don't know …" (rdc_nufqobh)
- "Meanwhile, the recipe you didn't see because it didn't come up in search results…" (rdc_nu7pceb)
- "This also isn’t fair bc the atheist side went first so the whole time the believ…" (ytc_UgyvyL0c8…)
Comment
i get using it on your art to stop ai from training on it, but i feel like doing it just for the sake of poisoning the dataset is a bit malicious. some of the people using these datasets are just students trying to understand the technology or scientists trying to develop some cool technology, not all the things using this will be big companies making image generators. people should not be scrutinized for poisoning their own images but it is my opinion that this information should be available so that an artist can avoid their art being stolen but still allow for interesting technology to continue being developed.
edit - it this also would cause issues for image classification models which are only capable of stating what's in an image and is not capable of generating them
Source: youtube · "Viral AI Reaction" · 2025-04-01T12:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz-bhp8gtQq2EAQAVd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz3I4E_NVYq1v8xAAl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzrSxcauGLIZpH99xB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgygV3QiFM8LlGISKgF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxzrkdbWV7ERihLoVV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugygz557KBCiVWOzkQ54AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxA5ybVRECZT7gApqt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugya8tqvP06eOR-o7FN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzxSixg5VBpEoBYvYZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgydMlXcbJuJ9G_aIAx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
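The raw response above is a JSON array with one record per comment, each carrying the four coded dimensions. As a minimal sketch of how such output could be parsed and sanity-checked — where the per-dimension vocabularies below are inferred only from the values visible in this response, not from the full codebook — something like the following would do:

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# response; the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "disapproval", "approval", "resignation", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    valid = []
    for rec in json.loads(raw):
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries
        # Every dimension must be present and hold a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"contractualist","policy":"none","emotion":"fear"}]')
print(parse_codes(raw)[0]["emotion"])  # -> fear
```

Filtering rather than raising keeps a batch usable when the model emits an occasional off-vocabulary code; rejected records can be queued for re-coding instead of failing the whole response.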