Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The mew Ai will replace all humans maybe and the USA Goverment will kill most US…
ytc_Ugz8oDCp3…
Bruh, if Tesla autopilot fails, and you have thicc chunky obstacle ahead of you,…
ytc_UgxJP_4d4…
I'll stick my neck out and say this: Everyone arguing in favor of the workers is…
ytc_UgzrCQCqc…
All the AI did was rip bits out of actual art, and then make a collage. The AI i…
ytr_UgwyLgrV-…
I'm in marketing, I helped, in my small way, a big brand in RSA grow from startu…
ytc_UgzrjUWpd…
I have a lot of respect for this guy, but he lost me in the last few seconds on …
ytc_UgxQt1z4w…
I feel like the thing we miss when talking about misaligned AI is that the corpo…
ytc_Ugwaf8pzY…
i like the plug for the war game at the end,
"now lets train that AI a lil' more…
ytc_Ugx5QzKuP…
Comment
This is more science fiction than reality. Humans have to create that AI and program it to do what they want. Not to mention that many jobs are inherently human and would be nearly impossible to replace. I don't think there is any chance of something that extreme happening in the next few decades, or likely even in our lifetimes. But if it did happen, the answer would likely be some dramatic change in how money and economies work such as UBI. I discuss this a bit in the full video that this short was taken from.
youtube
AI Jobs
2023-07-10T02:4…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgyEXNQM4BsJJ_2blnd4AaABAg.9tYv_aQwrVg9td5C7VvjhB","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyEXNQM4BsJJ_2blnd4AaABAg.9tYv_aQwrVgA-shk8J_q3N","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugx0jSp_Gqnj0Cukph14AaABAg.9swajZfYjQHAKXEtp72HNa","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugw96SAzq4vIACDCSTt4AaABAg.9ryTTkiVBxs9ryXfJaL1FF","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugw96SAzq4vIACDCSTt4AaABAg.9ryTTkiVBxs9rzuG8sesbI","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugwfdj_Bqzzo6meliF94AaABAg.9rm6hb-AjXD9syN8gROxGC","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_Ugwfdj_Bqzzo6meliF94AaABAg.9rm6hb-AjXD9ulxsLcsfC0","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgyxU9XMyjLwyaQD-yZ4AaABAg.9r4DPDMWpRv9wkYFtjfBHn","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgynvGhxRdgO0TXARzd4AaABAg.9pwi9yU2O1GAMCWKVYRzqu","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxJiqSZlckQ0x8do_F4AaABAg.9pCFpnS9BOH9pTIMTO8pYp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
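The raw response is a JSON array with one object per comment: an `id` plus one value for each coding dimension (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the IDs and the `index_by_id` helper below are illustrative assumptions, not part of the actual pipeline):

```python
import json

# Illustrative payload following the schema of the raw response above;
# these IDs are made up, not real comment IDs from the dataset.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_example2", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(payload)
    index = {}
    for rec in records:
        # Keep only records that carry an ID and every coding dimension.
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            index[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return index

coded = index_by_id(raw_response)
print(coded["ytr_example1"]["emotion"])  # resignation
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each coded comment resolves to its four dimension values in a single dictionary access.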