Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I hate AI. I hate the people who made it. I don't care if it's rational, I want it to fail. I want everyone who invested in it to fail. I want the people who love it to lose it.
AI is evil technology that exists to replace humans. It is technology that exists only to benefit the wealthy ruling classes. There IS no good ending to AI.
People are already losing their minds and murdering their own mothers cause this tech told them it was a good idea.
If you support AI, I DO NOT support you. More so, I regard you as an enemy, a bad person in every way who should be opposed.
youtube · AI Jobs · 2026-01-01T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw5goZXzu--LaifmGx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzJAE_pkszJaNiDhCV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwrziOXjOIXCL037jR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxGr11Y_jn3OQATgVF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzvuv_XQ6FrpZx0dK94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxTP0m5p5lUv6DZXwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzu58v7ke4d5kxUmvV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwtbAlNbZ0j_c-sIWh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyeY_0dQMqQ_EKiNAR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1RQe0AmAG05VYKDV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
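A raw batch response like the one above can be parsed and checked programmatically. The sketch below is a minimal illustration, assuming the dimension values visible in this sample are the full codebook (the real codebook may include more values); the `lookup` helper and the two-record `raw_response` string are hypothetical stand-ins for the stored response.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above — an assumption, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed"},
}

# Abbreviated stand-in for a stored raw LLM response.
raw_response = """[
{"id":"ytc_Ugzvuv_XQ6FrpZx0dK94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzJAE_pkszJaNiDhCV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]"""

def lookup(raw: str, comment_id: str) -> dict:
    """Return the coded record for one comment ID from a raw batch
    response, rejecting any value outside the expected codebook."""
    for record in json.loads(raw):
        if record["id"] != comment_id:
            continue
        for dim, allowed in SCHEMA.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"unexpected {dim!r} value: {record.get(dim)!r}")
        return record
    raise KeyError(comment_id)

row = lookup(raw_response, "ytc_Ugzvuv_XQ6FrpZx0dK94AaABAg")
print(row["policy"], row["emotion"])  # → ban outrage
```

Validating at parse time catches codes the model invented outside the schema before they silently enter the coded dataset.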