Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I wonder if they tried to boost the brains that are conected to Neralink with AI…" (`ytc_UgwMPrSC_…`)
- "AI is fine, humans are crappy. Best example mecha Hitler Grok. Anyone with a lit…" (`ytc_UgxYd6UlB…`)
- "Most people think of ASI when they hear AI and many who know of AGI think of ASI…" (`ytc_UgyIB2DOo…`)
- "Ai is as safe as a loaded rifle... that can think and act without human interven…" (`ytc_UgxPxfOA3…`)
- "I don’t think this will ever happen because humans will not have their shit toge…" (`ytc_UgzQWkgHk…`)
- "PDs laying off their detectives. All crime to be solved by AI and Google positio…" (`rdc_oa87475`)
- "wow look at that an actual intelligent person giving a salient non fearmongering…" (`ytc_Ugyw5yiGU…`)
- "given complexity problem of neural networks, only possible hope for safety could…" (`ytc_UgzkLsmjW…`)
Comment
What amazes me is the fact that now all decent tech companies are asking you if you use AI tools in your projects, or tell you right on the interview that they want you to use AI tools on the job - Gemini, Claude, Windsurf, ChatGPT, etc. - just to close Jira tasks ASAP.
What would happen if you tell them that you don't use them in your own projects, or that you don't want to use those AI tools on the job?
Well, there's a high chance that they simply won't hire you, and would love to continue with other candidates nodding their head and showing a positive attitude towards AI... We need a trade union in tech or something, 'cause the market is more ridiculous than ever before.
youtube · AI Jobs · 2026-01-20T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw2p3waOkxzbbEBGbV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyWIJlwvJvTRsK427x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxEEEcyJIkyc0fYAXB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgypSkXh0hlK0cz5Hed4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyMteDSSvdZ1Zpnux4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJT-HnjLFlikLy7ER4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgxrdQlvY6x7a1WOXwB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxF9-WyEqpTYHqyPQZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9DodY5vrlzvQcx3R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwlM2qcrEzxrFdVPqB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
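A raw response like the one above can be parsed and indexed so that a coding is retrievable by comment ID, which is what the lookup above does. The sketch below is an illustrative assumption, not the tool's actual code: the field names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the coding-result table, while the `index_codings` helper and the truncated two-entry sample (taken verbatim from the JSON above) are ours.

```python
import json

# Two entries copied from the raw LLM response above, for demonstration.
RAW_RESPONSE = """[
  {"id": "ytc_UgxrdQlvY6x7a1WOXwB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyJT-HnjLFlikLy7ER4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]"""

# Coding dimensions as shown in the result table; treated here as required keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: coding_dict},
    silently skipping any entry that is missing a required key."""
    index = {}
    for entry in json.loads(raw):
        if REQUIRED_KEYS.issubset(entry):
            index[entry["id"]] = {k: entry[k] for k in REQUIRED_KEYS - {"id"}}
    return index

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgxrdQlvY6x7a1WOXwB4AaABAg"]["emotion"])  # fear
```

Keying the index by comment ID makes the "look up by comment ID" operation a single dictionary access; malformed entries are dropped rather than raising, so one bad line in a model response does not lose the whole batch.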