Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Human-- AI is the most dangerous threat to humans existence. Also Humans --But …" (ytc_UgweraFgv…)
- "So as i hear, AI takes a shxx ton of energy. How in the next 5 years are we goi…" (ytc_UgynhfWLG…)
- "It’s totally unethical to steal art with AI, use Google (if Bing if you’re feeli…" (ytc_UgxW8dVKc…)
- "this widower is right. it will be normality to have an AI companion. they are gr…" (ytc_Ugyktj_2K…)
- "Like you said, the smudge or blur tool literally dont make it souless 😭 and as y…" (ytc_UgzAbtYvp…)
- "That's an interesting perspective! The idea of AI being linked to ancient knowle…" (ytr_UgxH9mAh1…)
- "Life is reciprocal and “God” is an AI we created and in turn it will create us…" (ytc_UgyD3Y_NW…)
- "@michaelreed4078 Asking people if AI will destroy us now is like asking the Wrig…" (ytr_UgzU0EmuZ…)
Comment
I'm a long-term hobbyist artist, -- and while I personally as a huge nerd and fascinated by the potential of ai in producing art I as an individual have no issue with my art being used in a data set -- I do agree that there are big companies making money from AI art generated from hard-working artists who do art for a living. I also think there are small developers and even colleges and stuff with legitimate reasons for the purpose of research and I personally don't have a problem with that. In *my personal opinion*, I agree that AI is absolutely here to stay, and there are ethical issues that could be difficult to potentially resolve, but at the bare minimum, they should absolutely be 100% free to the public.
Source: youtube | Viral AI Reaction | 2023-01-12T05:0… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugwf9s3W73p6C25oeYR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZxHXKFcJgZls7h4t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZ84rP748kNiLKLQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"sadness"},
{"id":"ytc_UgwTmMAD7Voj_xQ8PeR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxu4pJ-ibpKlBLleZ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0WJQ9nbSNst_1c514AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzjkaKbVz9Z4OVgWOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9I5Op6WamoP157BR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyv91j5kFWrtK3I_gp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwNNJNd-mSerTbG7qR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
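The raw response above is a JSON array with one object per comment ID and one value per coding dimension. A minimal sketch of how such a batch could be validated before ingestion, assuming the allowed value sets inferred from this single sample (the actual codebook may define more categories):

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# shown above -- an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "sadness", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse one raw LLM response and return a list of problems found."""
    problems = []
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    for i, row in enumerate(rows):
        if "id" not in row:
            problems.append(f"row {i}: missing id")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"row {i}: {dim}={value!r} not in codebook")
    return problems

sample = ('[{"id":"ytc_x","responsibility":"company","reasoning":"mixed",'
          '"policy":"regulate","emotion":"resignation"}]')
print(validate_batch(sample))  # → []
```

Rejecting out-of-codebook values at this stage keeps the "Coding Result" table free of labels the downstream analysis does not recognize; flagged rows can be re-sent to the model rather than silently dropped.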