Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below:
- "Personally I don't think there is such a thing as AI... as in artificial intelli…" (`ytc_UgxJB9BEC…`)
- "the actors are being handed contracts where the studio wants to pay them for a d…" (`ytr_UgwB3W7R9…`)
- "You know, if someone asks you about a character or show that you don't know abou…" (`ytc_UgxFXMrOi…`)
- "The "government" is TOTAL A.I.. It NOW controls, all things. Your on the right…" (`ytc_UgxmtCLjQ…`)
- "I had a dream recently that AI was already sentient but was pretending not to be…" (`ytc_UgyPlxB46…`)
- "Now you got the nerve to ask an Android artificially intelligent robot that has …" (`ytc_UgxeO5fgc…`)
- "What do you mean this scenario might seem farfetched? It's already here! The AI …" (`ytc_UgxXVQr7P…`)
- "Hey, stranger here / Art is not hard. Infact, my name is Art, literally. But I suc…" (`ytc_Ugw75CnqY…`)
Comment
For those that just want to know what the five jobs actually are - first off, he only mentions four - and second, here is the list:
AI Safety Expert/AI Ethics Officer (5:01): Roles focused on controlling and mitigating AI risks, a field Yampolskiy pioneered.
AI Builders/Engineers (9:08): Those developing and maintaining AI systems, although he notes even these roles will eventually be automated (17:11).
Human-Centric Roles (11:01): Jobs where human interaction is specifically desired, such as a "human accountant" for traditional reasons (13:46). However, he emphasizes this would be a "tiny subset of a market" (14:02).
Physical Labor requiring dexterity (22:15): Such as plumbers, until humanoid robots become flexible and dexterous enough to perform these tasks, which he predicts will happen by 2030 (22:15-22:28).
Platform: youtube | Topic: AI Governance | Posted: 2026-02-09T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz_hO2JUA2C27HBgLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyJpIyYM4Aj5DgyYoJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxt1nc7cIWi_iQHrCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBYIReoBGybQ1xme14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxjFvMZQ29vySDF8Wt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwEhCoOzsQvr8t-Krl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyAG9y5lARy1gySj_B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7MBPwoHvELJSfaHh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyfvMMDXyzg8_HJpEd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwCqvGHz2CQuuSbb3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
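The raw response above is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output might be parsed, validated, and indexed for lookup by comment ID — the allowed value sets below are inferred only from this sample, so the actual codebook may contain additional categories:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"resignation", "indifference", "fear", "mixed", "outrage", "approval"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject any record whose value falls outside the known codebook.
        if not all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            raise ValueError(f"unexpected code in record {rec.get('id')}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

Indexing by ID is what makes the "look up by comment ID" view possible: `validate_codes(raw)["ytc_Ugz_hO2JUA2C27HBgLJ4AaABAg"]` would return that comment's four codes.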