Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm a person that used AI a lot, from just LLM to image generator, at first it w…" (ytc_Ugx9P1FJd…)
- "this person makes me really sad, like dawg we all start out with barely the abil…" (ytc_Ugxre0WZp…)
- "Legit, sane questions you are asking to the future. Every analysis is like a pri…" (rdc_j4xhdyn)
- "I mean, it's a clever defense against AI and I like the usage of memes as extra …" (ytc_UgyNSw-5X…)
- "One way to argue about A.I. art is that you are not the artist. You are giving t…" (ytc_UgwJgNS20…)
- "I think the statement is wrong: believer vs. atheist. I think they both believe …" (ytc_UgyvRXPC7…)
- "AI is just a way to steal someone elses work without paying them, this is a grea…" (rdc_jwv67gz)
- "There is a native american saying... speaking with forked tongue....a lie out of…" (ytc_Ugyk28hZM…)
Comment
> Hi. I'm not an AI stan. I don't know what 'stan' means. And I don't really use AI as its not useful to me.
> There is no assurance as to why these videos are being recommended to me honestly, considering I don't really use AI art for anything.
> That being said; AI or at least AI art feels like early stage DA artists. They are just tracing and need to be trained instead, given their own personal examples to use maybe?
> There could be a positive outcome if an LLM or something similar to achieve Basic Agency.
> And I'd like to improve AI.
> We shouldn't be stealing art will hep with this. But that being said;
> Treating AI like a young, up and coming artist might help.
> Not by artists, but by those who create the data sets.
> A though; is it possible to give AI basic studies, or hire an artist to create a basic set of data for an AI to use...
> An ethical way to train an AI?
Platform: youtube
Video: Viral AI Reaction
Posted: 2025-12-30T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
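The table above is a per-comment view over one entry of the raw LLM response. A minimal sketch of how such a view could be rendered, assuming the field names shown in the JSON (the real tool's internals are not shown here):

```python
def coding_table(entry: dict, coded_at: str) -> str:
    """Render one coding entry as a markdown Dimension/Value table.

    `entry` is assumed to carry the four dimension keys seen in the
    raw response; `coded_at` is the coding timestamp.
    """
    rows = [
        ("Responsibility", entry["responsibility"]),
        ("Reasoning", entry["reasoning"]),
        ("Policy", entry["policy"]),
        ("Emotion", entry["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

entry = {"responsibility": "none", "reasoning": "mixed",
         "policy": "none", "emotion": "indifference"}
print(coding_table(entry, "2026-04-27T06:24:59.937377"))
```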
Raw LLM Response
[{"id":"ytc_UgwmYZhb_i5EtiH1eyl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwWeH60bWgbVqkaF6Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwe23oiPJbc6B0rb7d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwo8pAB71qscSMBjz94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxN_vP-ZSrEDqvAHwV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxm5chAF7Or4Xyp4yx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyjykeuSBDeH0q_FGF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxosB1FBi2fDWKWCL94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwZ_j-sPAViSxpSzvR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwag2TsIsqIV7D8NO94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]