Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Yeah... But Deezer only sees black&white when it comes to AI. They ignore the re…" — ytr_UgxFZa9sD…
- "At least when a person uses someone else's work as reference, they're still putt…" — ytc_UgwvjR2Zc…
- "I take it you haven't seen Husk IRL's experience with trying to get ChatGPT to r…" — ytc_UgwfYpMJl…
- "Tbh even though it is ai art it still looks really nice / (I am in no …" — ytc_UgxR6KOqE…
- "A good way to tell between ai and human is that unless you ask it to do so, the …" — ytr_Ugx9pr52c…
- "i don't think you're taking into account that there isn't enough fresh water to …" — ytc_UgwkQYFtg…
- "Here’s the flaw / If all jobs get replaced by A.I / Then no one is able to earn mone…" — ytc_UgznRdQ1o…
- "They're making it sound like we don't all have free access to AI. Well we do and…" — ytc_Ugxi5yOXN…
Comment
Billionaires hate employees. Employees want livable wages and safe working conditions, and may form unions to get those. Unions mean individual workers can't be bullied by billionaires. The union may join with other unions to take political action to put limits on billionaires.
Result: billionaires will try to have no workers at all. No more restrictions on them, and more profits because they won't have to pay wages. Bonus - in states like Texas, the government will send cops to beat down workers who protest.
Other result: When almost nobody is employed in a capitalist economy that needs consumers to make the economy thrive, the economy collapses.
Yay, automation! Great, isn't it?
youtube · AI Jobs · 2025-05-28T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxHQ5ejoBVnaU8nAHt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwgY-RsgM02TN4jsOF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxcQyZ43LF2uS64xzF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwlyqdmmgadM0nuWdN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyOiNtAcaXHIzT-ZCh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgznIxqg1bLTPA5SMyZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwQQ-OLKkI4wJQx1J94AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyVBQ82fcf6nBkPpVR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQeGgnTH1rhWh1GGl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxRtapauDH68WUaPbp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
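A response like the one above can be parsed and sanity-checked before it is indexed by comment ID. The sketch below assumes the allowed values per dimension are exactly those seen in this response; the real codebook may include values not shown here.

```python
import json

# Allowed values per dimension, as observed in the response above
# (an assumption -- the actual codebook may define additional values).
SCHEMA = {
    "responsibility": {"ai_itself", "company", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_coded_comments(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    value outside the assumed codebook.
    """
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry["id"]
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {value!r}")
        coded[comment_id] = {dim: entry[dim] for dim in SCHEMA}
    return coded

# One entry from the response above, used as a minimal example.
raw = ('[{"id":"ytc_UgwgY-RsgM02TN4jsOF4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
codes = parse_coded_comments(raw)
print(codes["ytc_UgwgY-RsgM02TN4jsOF4AaABAg"]["emotion"])  # outrage
```

Indexing by ID makes the "Look up by comment ID" view a plain dictionary lookup, and the validation step flags any off-schema value the model invents instead of silently storing it.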