Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
If this was the only reason for the strike, nobody would argue with them.
The problem is... it isn't. And it kinda shows, because the first show I know of to use AI was secret invasion. And that's not even close to the worst thing made this year.
If you make better stories than an AI that still fails to extreme levels, you can strike for whatever pay raise you want. But if you watch velma... then watch Riverdale... then anything in the arrowverse... then she-hulk... you kinda feel like a robot already wrote all of these shows... and it wrote then BADLY.
But yeah, let's say writers MUST be human and actors MUST be human. Great. Now humans just need to actually do their jobs, instead of being political activists with side gigs as writers.
youtube
2023-10-18T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugys6dFDBh9sXuvjvol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjfOJm2TX_bFYCG9F4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwpRLPTUniy5zAiT94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXJPPpFwi07_IdVsF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBUyzXtBR5JIKkVz94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzeoKf9k34YW2uy0mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyG8jJ1OU2r3yPU2Tx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxFOMneEM4WbOCEVJh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxxQOJnY4vX75xY4wl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0QH5emgBRII38SZl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
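The raw response is a JSON array with one object per coded comment, carrying the same four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-ID step in Python: the `lookup` helper is hypothetical (not part of the tool), and only two rows from the batch above are kept here for brevity.

```python
import json

# Raw model output: a JSON array of per-comment codings (schema taken from
# the batch shown above; rows truncated to two for the example).
raw_response = """[
{"id":"ytc_Ugys6dFDBh9sXuvjvol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0QH5emgBRII38SZl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the batch by comment ID so a single comment's coding can be pulled out.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID (KeyError if absent)."""
    return codings[comment_id]

row = lookup("ytc_Ugx0QH5emgBRII38SZl4AaABAg")
print(row["responsibility"], row["emotion"])  # developer indifference
```

This is the mapping the "Coding Result" table renders: one JSON object from the batch, keyed by the inspected comment's ID, displayed one dimension per row.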