Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by its comment ID or by picking one of the random samples below.
Random samples:

- `ytc_UgwF9hUsa…` — "So while they argue against the dementia caused by the lazy electronic downloadi…"
- `ytr_Ugws38Dp8…` — "BS? I doubt a Fact Checking, Pattern Recognition Neural Network would support ‘B…"
- `ytc_Ugy7uCX2m…` — "Well yeah, so what? That is called progress. Should we just NOT develop AI and l…"
- `rdc_dkepiqo` — "Too late. Cars have been rolling computers for a decade or more now, and now mos…"
- `ytc_UgyoDKMov…` — "I asked ChatGPT about it, and it said that the government is watching us and tha…"
- `ytc_UgxJ8pevw…` — "The safest job is now to shift into AI safety in whatever field u r, If u are a …"
- `ytr_Ugyxfnzg_…` — "imagine if we could use AI so everyone has to work less instead of centralizing …"
- `ytc_UgxcZCHv4…` — "Consuming AI models content will make the real models obsolete, that's good. A d…"
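The lookup-by-comment-ID flow can be sketched as follows, assuming the coded records have already been parsed into dicts shaped like the raw LLM response shown on this page. The function name is illustrative, not the tool's actual API, and the source meaning of the ID prefixes (`ytc_`/`ytr_` for YouTube, `rdc_` for Reddit) is inferred from the samples.

```python
# Minimal sketch of "look up by comment ID", assuming records parsed
# from the raw LLM response into a list of dicts. Names are hypothetical.

def find_by_comment_id(records, comment_id):
    """Return the coded record whose "id" matches, or None if absent."""
    for record in records:
        if record.get("id") == comment_id:
            return record
    return None

# One record copied from the raw response shown on this page.
records = [
    {"id": "ytc_UgwHrruNIgiw-aJn-p54AaABAg", "responsibility": "ai_itself",
     "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
]

match = find_by_comment_id(records, "ytc_UgwHrruNIgiw-aJn-p54AaABAg")
```

A production tool would likely index records by ID in a dict (or database) rather than scanning a list, but the contract is the same: one coded record per comment ID.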
Comment
> This is nonsense. This is transitory. Yes there are some AI evangelists being over optimistic about how rapidly we will get to AGI and they are driving hype for many reasons. But this idea that new jobs will appear this time is just stupid. If intelligence goes up, cost of intelligence goes down, the entity providing that intelligence will replace those entities that are more costly and less intelligent. This idea of the humans will just go to higher value - well the issue is that the human will be pushed higher and higher in jobs. And the human will reach their maximum potential to deliver this more valuable work and then they will be replaced. The tide is coming in, and humans need to keep climbing up the mountain of value.

youtube · AI Jobs · 2025-10-29T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxPv1p2QQZ36gqOxnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzSlwjYY1INd6TGPI14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhcDlPA2461F6hDDh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwC74n_npm882ZFvOh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyoXN3cQuv__opWhUd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwxKbndTDUhaqjeYR94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzxVUZAT7jo29jjOHZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyHcd5o77D9YIphAy94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwHrruNIgiw-aJn-p54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwicf3VwVhQMCgR91Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]
```
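Because the raw response is free-form model output, a downstream pipeline typically parses and validates it before accepting the codes. A minimal sketch, assuming the allowed labels are the ones visible on this page (the real codebook may define more values per dimension):

```python
import json

# Hedged sketch: check a raw LLM response string against the coding
# schema. The allowed-value sets below are inferred from labels visible
# on this page, not taken from the actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "fear", "resignation"},
}

def validate_response(raw):
    """Return a list of problems found; an empty list means the response is clean."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return ["not valid JSON: {}".format(exc)]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append("record {}: missing id".format(i))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append("record {}: bad {!r} value {!r}".format(i, dim, value))
    return problems

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}]'
```

Rejecting (or flagging for re-coding) any response that fails validation keeps malformed model output from silently corrupting the coded dataset.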