Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Sample (truncated) | ID |
|---|---|
| Spoken like a businessman?? "Of course, we have the technology?? but first, we'r… | ytc_UgxbpfV6G… |
| Most people already don't know their ass from a hole in the ground.. .a.i. will … | ytc_UgxnMfc_t… |
| @P4trickSt4rmie it's their personal account but they are a worker trying to refer… | ytr_UgwumHzve… |
| The real reason: self driving cars sit at a stop sign forever because they're de… | ytc_UgwhBbOGJ… |
| As a stable diffusion user, i never had money to burn on art in the first place.… | ytc_UgxxVAEoL… |
| You think 95% of this sub (being generous) even know how to use an open source L… | rdc_m943lll |
| i just gave it my worst essay paper and just asked the ai to write with how the … | ytc_UgxjBWlaM… |
| This shit sucks, but don't forget, painters never stopped painting after the cam… | ytc_Ugy_Rben0… |
Comment
Training AI on legally owned data might be ethical, but does it solve the issue of displacing writers/artists and musicians with cheap labor's ultimate form?
Additionally, all any legal guard rail will do (and have done already in the case of Adobe) is make companies write terms into their Ts and Cs that enable them to legally capture and sell even more of our data for massive profits against our wishes.
At a certain point, we need to acknowledge that the application of the technology itself is the problem and that the people being targeted need protection.
Source: youtube
Posted: 2025-04-17T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyM53xDei1EEIMxJIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx4S8qY-SmnKz8vlWV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4Ks0gLeXHekKux_F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwFm4yVYfB9Y75oh6h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxOiCZCDm3xULFSSO94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz3zT6gqRqepGeuyf94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNe-QsDzF6DJwOpK54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzIRVzFHWkpY01GzD94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxuZdWbM9Z3TqIXJYd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwl-aKrMxrW87NbwG94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
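The raw response above is a JSON array of records, each carrying a comment ID plus the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and sanity-checking such a response is below; note that the allowed value sets are assumptions inferred only from the values visible in this sample, not the full codebook, and `validate_codes` is a hypothetical helper name.

```python
import json

# Allowed values per dimension. These sets are an assumption inferred from
# the values observed in the sample response above, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or out-of-schema rows."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

raw = '[{"id":"ytc_example","responsibility":"company",' \
      '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]'
records = validate_codes(raw)
print(len(records))  # 1
```

Validating against explicit value sets catches the common failure mode where the model invents a label outside the schema, so bad rows fail loudly instead of silently entering the coded dataset.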