Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytc_UgyJYsIe1…`: "Very nihilistic. What makes this guy think that you can't combine super intellig…"
- `ytc_Ugw1ELt1u…`: "AI is slop. It's not going to gut the working class. The Owners are going to use…"
- `ytc_UgwSVwU3R…`: "At first I was convinced that using AI for reference was fine, but now having th…"
- `ytc_UgwXjzmz4…`: "atm as much I can see as a laiman, I am just \"scared\" of AI breaking any passwor…"
- `ytc_UgzKlY65S…`: "the biologist michael levin draws a very remarkable distinction by proposing \"wh…"
- `ytc_UgwmdTGx8…`: "As tech people I would think that remapping the AI key would be obvious. Its of …"
- `rdc_jtrmniv`: "lol it's hard to write good code using LLMs! we're doing that at [https://github…"
- `ytr_UgwaNINYC…`: "@knobwobbleno you dont need to. there is reality and the tool just need to show …"
Comment

> sorry but anyone that thinks ai is going to replace all of our jobs is retarded. ai is fucking stupid and bad, it's just barely good enough to trick you into thinking it's competent, but it absolutely can not replace any single job in america. if you think it can, you've been tricked.

youtube · AI Jobs · 2025-12-23T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwJVFhEAUIQFqsBA-h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyCrfP-u0bCnMThxz94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwkBv16azEDGak8Twx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwQp0NLIBFUJD7X_gV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxsOFfyoBfVoO1NKlB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx0C48P5I64YuGzKMh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx3bMkRGq39uderXfB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwSxniyG0tJigw-WDF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "skepticism"},
  {"id": "ytc_UgypohBxOP1bOeBES194AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwX2Rg2f6D-eZUrg8N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
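A batch response like the one above has to be parsed and validated before the per-comment codes can be stored. The following is a minimal sketch in Python; the `DIMENSIONS` vocabularies are inferred from the labels visible in this response and are assumed, not confirmed, to be the full label sets, and `parse_batch` is an illustrative name rather than part of the actual pipeline.

```python
import json

# Closed vocabularies for each coding dimension, inferred from the
# labels that appear in the raw response above (assumed complete).
DIMENSIONS = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "resignation",
                "mixed", "skepticism"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError when an entry is missing a dimension or uses a
    label outside the expected vocabulary, so malformed model output
    is caught instead of silently stored.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in DIMENSIONS.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: entry[dim] for dim in DIMENSIONS}
    return coded


# One entry from the raw response above, used as a smoke test.
raw = ('[{"id":"ytc_UgypohBxOP1bOeBES194AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_UgypohBxOP1bOeBES194AaABAg"]["emotion"])  # outrage
```

Validating against a closed vocabulary at parse time is what makes a "Raw LLM Response" view like this page useful: any entry the parser rejects can be pulled up by comment ID and inspected by hand.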