Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "Nice work getting that out of GPT! I am sure it threw a few cold, machine-like l…" (rdc_kvscjlk)
- "Looking back at this two years later, while having severin different LLM models …" (ytc_UgwN7cEnn…)
- "Your assessment of current AI limitations seems to describe most humans I meet, …" (ytc_UgwgiDHcp…)
- "What a really bad and tone deaf take from Neil Tyson, because if we apply his lo…" (ytc_Ugz_kL-eB…)
- "What happens when they find out they've been playing with themselves / If AI was …" (ytc_Ugx_K8Psn…)
- "I found a copy of How the Elite Print Their Wealth in my business partner’s draw…" (ytc_UgzmWFlZ7…)
- "You'd really have to be mentally deficient to have created such an AI machine, it's …" (ytc_UgyR3nirP…)
- "THE only PEOPLE WHO SHOULD BE PROGRAMMING FUTURE AI ARE BLACK WOMEN 40 YEARS OF …" (ytc_UgxW3DN7P…)
Comment
I love your videos, Sajjad! At 7:05, you mention that out of the 1.17 million who were laid off in 2025, only 5% lost their jobs "because AI automated them." However, I couldn't find any evidence for this. Mind citing your source?
I could only find a Guardian article mentioning that in 2025, AI was cited as a reason for more than 54,000 layoffs. That’s roughly 5 percent. But that figure doesn’t include all the cases where automation or AI quietly replaced human roles without being officially labeled as such.
As far as I know, no comprehensive, in-depth study has captured the full picture yet. And if someone actually did that research, I suspect the numbers would be significantly higher.
youtube · AI Jobs · 2026-02-14T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
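The table above can be modeled as a small record type. The sketch below is a hypothetical illustration, not the tool's actual implementation: the allowed category values are only those observed in this page's coding results (the real codebook may define more), and the `Coding` class name is my own.

```python
# Hypothetical sketch of the coding schema shown in the table above.
from dataclasses import dataclass
from datetime import datetime

# Allowed values per dimension, taken from the codings visible on this
# page only -- an assumption, since the full codebook is not shown here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

@dataclass
class Coding:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def __post_init__(self) -> None:
        # Reject any value outside the observed category sets.
        for dim, allowed in ALLOWED.items():
            value = getattr(self, dim)
            if value not in allowed:
                raise ValueError(f"{dim}: unexpected value {value!r}")

# The record from the table above.
c = Coding(
    responsibility="none",
    reasoning="consequentialist",
    policy="none",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
```

Validating at construction time means a malformed LLM output fails loudly instead of silently entering the dataset.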
Raw LLM Response
```json
[
{"id":"ytc_UgwEt5MlUG2jIcZcF4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxaRK0srbK95XbRZ1F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy25rHtKOVjGbu-bON4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgybqwhtwpZtoBHMYvx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyAEtpvdCTtBrxOKoV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyqtTDty1rjLgCtAtF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw2yhgZVTXd5hk9QqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDZXqn5Qy2y8N1FOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXeWpGbRM4yYOhvxN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxtPraVG3unuy9-0Hd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
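A raw response like the one above is a JSON array of per-comment codings, so looking one up by comment ID reduces to parsing and indexing. This is a minimal sketch under that assumption (the `index_codings` helper and the inline two-record sample are mine, not the tool's code); it also checks that each record carries the expected keys before indexing it.

```python
import json

# Inline sample in the same shape as the raw LLM response above,
# shortened to two records for illustration.
RAW_RESPONSE = """
[
  {"id": "ytc_UgwEt5MlUG2jIcZcF4d4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyAEtpvdCTtBrxOKoV4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and return {comment_id: coding}."""
    records = json.loads(raw)
    codings = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing keys {missing}")
        codings[rec["id"]] = rec
    return codings

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgyAEtpvdCTtBrxOKoV4AaABAg"]["emotion"])  # → outrage
```

Indexing once up front makes the "Look up by comment ID" path an O(1) dictionary access rather than a scan over the array.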