Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or pick one of the random samples below.
Random samples — click to inspect
- "a rogue singularity is one 'thing' or new 'being' that might save the human race…" — ytc_UgwB-QLM0…
- "Lazy and discouraged people get sucked into predatory, dehumanizing systems like…" — ytc_UgxWuSjPW…
- "If you want to be an artist, LEARN HOW TO DRAW, not use an AI generator.…" — ytc_Ugz9nkXJH…
- "Not sure what her idea of prison is, but an open air prison sounds better than b…" — ytc_Ugzc6dW64…
- "As an artist I know that seeing somebody else’s art work and getting inspired or…" — ytc_Ugwi-S2xe…
- "The point of the story is that the three laws don't work. More to the point, no…" — ytr_UgxBM-JXl…
- "I've had a issue articulating exactly why I hate AI, and I never use it because …" — ytc_UgxfexMSt…
- "@xiaominsongnot the end of human knowledge but you reach the end of LLM currect…" — ytr_Ugw9hNLMs…
Comment
And for lower-quality work, too. "Good enough", they'll say, not realizing how badly their customer experience is going to tank in less than a year. Has *already* tanked, even this far in. It's really not a great idea to give important tasks in the workplace to generalized LLMs stored in giant resource-obliterating data centers. If you're gonna use an LLM to 'replace' a job, you gotta have *people who know* in the first place to build it right so it does the job right. While continuing to maintain it functioning properly, just like with a factory. And yet. The people profiting off all of this at present are far too lazy to do the right thing.
Source: youtube · Video: AI Jobs · Posted: 2025-10-08T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwAKuLID4pywo8aK1t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugy3v7ii1tjOTwy5HSV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgznS5zd0trAzVenuIV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"disapproval"},
 {"id":"ytc_UgwesjmwOSMcQTK2ZqJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx_aJcqrPz51mbKLkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwwK7lkrd3U9zCJrtt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugx4GpO0iUkzWXAJIGd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzHAZ_45seqYxezqhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxUzeWLOIuq1PadQnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyVQkWpwmxdjECKzqN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
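A raw batch response like the one above can be parsed into per-comment codes and checked against the codebook before it is stored. The sketch below is a minimal illustration, not the tool's actual ingestion code; the allowed values in `SCHEMA` are inferred from the examples on this page (the real codebook may define more categories), and `parse_raw_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension -- ASSUMED from the sample output above;
# the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "resignation", "disapproval", "indifference",
                "outrage", "approval", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError if any dimension is missing or holds a value
    outside the codebook, so bad model output never reaches storage.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: invalid {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-row response, shaped like the batch above.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_raw_response(raw)["ytc_example"]["emotion"])  # fear
```

Validating at parse time means a lookup by comment ID can trust every stored code to be one of the codebook's values.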