Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Doctors will always tell you not to use AI because they know that one day it wil…" (ytc_Ugw65OZYF…)
- "You will lose your job. You still do mot know what AI can do in robotics. It wil…" (ytc_UgyFTqpqx…)
- "I asked ChatGPT if all the accidents and deaths (too many to list please researc…" (ytc_UgwJ5kIcv…)
- "I agree totally with this white American big boy his putting himself at risk for…" (ytr_UgylDMZG3…)
- "I'm pretty anti-AI as it stands, but there are a few models out there now that a…" (ytc_Ugzp9xxYI…)
- "I’ve found AICarma invaluable for understanding how AI perceives my brand; it sh…" (ytc_UgxRHiWTK…)
- "Blaming AI instead of corporate greed and outsourcing jobs. AI certainly is star…" (ytc_UgzbYC7XZ…)
- "16:12 AI tools are probably going to die in the same way photoshop or blender di…" (ytc_UgyCzXIl3…)
Comment
I can't understand on what basis he thinks we'll have 99% unemployment yet at the same time everyone will be provided for in terms of food, services and housing. The trend of increased automation certainly over the last century has not changed food poverty in the US nor in the UK. These are getting worse and since when are governments or businesses motivated to financially support the unemployed? The economic system entails little to no relation between tech development and public welfare. He even makes that point yet he thinks AI will both create mass unemployment yet rising housing and global food costs will somehow not be an issue? I really respect this guy and his area of expertise but can only imagine economics and politics aren't not within his areas of knowledge
Source: youtube · Topic: AI Governance · Posted: 2025-09-09T14:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
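The table above reflects a fixed coding scheme with four dimensions plus a timestamp. As a rough sketch of what one coded record might look like in code (the field names mirror the raw JSON response; the exact value sets are an assumption inferred from the visible samples, not the tool's actual schema):

```python
from typing import TypedDict


class CodingResult(TypedDict):
    """One coded comment. Field names follow the raw LLM JSON response;
    the allowed values listed in the comments are inferred from the
    samples shown on this page and are not an exhaustive set."""

    id: str              # comment ID, e.g. "ytc_…"
    responsibility: str  # government | company | developer | elite | ai_itself | none
    reasoning: str       # consequentialist | deontological | virtue | unclear
    policy: str          # unclear | none
    emotion: str         # fear | outrage | mixed | approval | indifference
```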
Raw LLM Response
```json
[
  {"id":"ytc_Ugz22qWIe69GZQrRt814AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxRzNb2Yg40qKxA3d94AaABAg","responsibility":"elite","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyBPtnImjgUAYFQbiR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzB1yDwZn6dviz_lz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyoLgHcFZiyp7Rr2n54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzv4JclMVDAPnQ1c6x4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxFdCodpn06soV7HN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxfcitndtjQA1T2JNp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz9ubZVHOoVs7l_B3p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxoTLPHQw4tefmBDe54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
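Because the model returns a JSON array keyed by comment ID, looking up the coding for any comment reduces to parsing the response once and indexing it in a dict. A minimal sketch (the variable names are illustrative; the single-row response below is copied from the full array above):

```python
import json

# One row excerpted from the raw LLM response shown above.
raw_llm_response = """[
  {"id":"ytc_UgyoLgHcFZiyp7Rr2n54AaABAg","responsibility":"government",
   "reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]"""

# Parse the model output and index every coded comment by its ID,
# so a "look up by comment ID" query is a single dict access.
by_id = {row["id"]: row for row in json.loads(raw_llm_response)}

row = by_id["ytc_UgyoLgHcFZiyp7Rr2n54AaABAg"]
print(row["responsibility"], row["emotion"])  # -> government mixed
```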