Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I worked in product for several big companies during the genesis of AI. We did a lot of testing and quickly found out that it wasn't worth it. Not for ML, not for the agentic stuff. Teams were building traditional "if/then" code and using standard data modeling and then slapping a chat bot on top and calling it "AI". When Trump started dropping the bottom out of the market, I think a lot of the big companies leaned into "AI" as the reason they were dropping employees so they didn't have to point out Trump's disaster economy and risk his wrath. The smaller companies, thinking the big companies were really using AI, all started dumping employees and "going to AI". And here we are.
Source: youtube · Topic: AI Jobs · 2026-02-06T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzFstQ7vRPt6BkG6hF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2MDWZ8yT1R9GL8ft4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwKxShHuLKudeGPsZR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxPL4Vm7GzboEEVBmV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxs-nIzwhCVud4ifB94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzHqVEKCzrqhl4Ni0V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzt5smlNcKLEA6MMtl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwkFHnKzfNFXU6HtLt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw-E_hPryohrHjmIkt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyKvC19X9gGMPPxeUN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```