Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Neil is not an expert on genAi. he's a theoretic physicist or what his actual PH…" (ytc_Ugzy6GwSE…)
- "I’d argue it doesn’t have _no_ value. After all, the viewer provides perspective…" (ytc_UgzvVl4re…)
- "That's because, technically, it is. There's a reason that most actual AI researc…" (ytr_UgwmIGCsU…)
- "I don't buy into the idea that artists will become irrelevant with the explosion…" (ytc_Ugx2sArAQ…)
- "Such a pile of bullshit. AI can't even do junior work unsupervised and it's stil…" (ytc_Ugw5AVFpk…)
- "I don't think Ai Shartists understand the value of actual human beings drawing s…" (ytc_UgzvQU9kZ…)
- "Honestly, let these companies have all the AI slop all they want. If you don't g…" (rdc_o8eihg0)
- "God said in the Bible, that there would be a "technology" that will control the …" (ytc_Ugz6WolHx…)
Comment
80% success rate on 1-hour tasks sounds impressive until you realize the 20% failures still require full human review. Nobody's deploying unsupervised AI for consequential work yet.
Capability doubles every 6 months in labs. Deployment in risk-averse enterprises? 18-36 months behind. Gap between "can do" and "companies trust it to do" is where careers still exist.
youtube · AI Jobs · 2026-02-25T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
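Each coded comment receives exactly one value per dimension. A minimal validation sketch in Python, with the allowed codes inferred only from the values visible on this page (the real codebook may define more categories; `SCHEMA` and `validate` are illustrative names, not part of the tool):

```python
# Allowed codes per dimension, inferred from the coded rows shown on this page.
# Assumption: the actual codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"industry_self", "regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row (empty list = valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        if dim not in row:
            problems.append(f"missing dimension: {dim}")
        elif row[dim] not in allowed:
            problems.append(f"unknown code {row[dim]!r} for {dim}")
    return problems

# The row from the Coding Result table above passes validation.
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "industry_self", "emotion": "approval"}
print(validate(row))  # prints []
```

A check like this is useful before trusting a batch of LLM codes: any row with an unexpected or missing value can be flagged for re-coding instead of silently skewing the tallies.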
Raw LLM Response
```json
[
{"id":"ytc_UgwKfb8be6Br5iu0Tap4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxRUnioLoRQViXENAV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwiMYnjCH_XcxpZukR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgycO39f-5FsIGv_E354AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsVLPUiY-AYltHC5l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_lgC1RtLglqFGIPN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEs8gw-f1gxxo38yt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy19OxIMZ6ySL1v7q14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzDb3goqByhP1Ionk94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw_fGUYt0_ivp6B2Ed4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
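The raw response is a JSON array keyed by comment ID, which is what makes the look-up-by-ID view possible. A minimal sketch of that lookup, assuming the response parses as shown above (the `lookup` helper and the truncated two-row excerpt are illustrative, not the tool's actual code):

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
raw_response = '''
[
 {"id":"ytc_UgwKfb8be6Br5iu0Tap4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgxRUnioLoRQViXENAV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
'''

# Index the coded rows by comment ID for O(1) lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raise KeyError if uncoded."""
    return coded[comment_id]

print(lookup("ytc_UgwKfb8be6Br5iu0Tap4AaABAg")["emotion"])  # prints approval
```

In practice, an LLM batch response may also come back as malformed JSON or with IDs that were never requested, so a production version would wrap `json.loads` in error handling and cross-check the returned IDs against the batch that was sent.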