Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The fatal conceit of the naive technocentrists is not in building the machine, b…" (ytc_UgzLpNF1G…)
- "You have to understand that these people using 'Machine Learning' (calling it AI…" (ytc_Ugy0k8yql…)
- "There is no such thing as sentient AI. In fact, we have no idea how to even get…" (ytc_UgxITjv_o…)
- "no need for the robot. all u have to do is just nag him. he’ll do whatever u wa…" (ytr_UgwGdmAyD…)
- "My first thought when thinking of AI and or human interface was a reference of i…" (ytc_Ugw4Ay2Ri…)
- "I will add one more comment. Humans are unpredictable... I don't think AI will …" (ytr_Ugx-Trh2p…)
- "AI IS COMING TO ALL our SCHOOLS….And that’s why I took my kid out of school beca…" (ytc_UgxGoIJw4…)
- "There is a much greater chance that calculating and scheming "humans" will schem…" (ytc_UgyzJETUg…)
Comment
> 1:03:13 This is a great comment.
> Past industrial revolutions involved replacing human JOBS, and that was fine, as others were created.
> The AI revolution is interested in replacing the entire HUMAN. That's fundamentally different.
> You won't see Eric Schmidt or any other CEO admitting this. But every CEO has a direct incentive to reduce headcount, and increase efficiency.

youtube · AI Governance · 2026-03-23T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzCsA1kKH_a83DuZlB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwtHj6Nej8RDWPCsex4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwEFL84C8sqAS1Su7h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugwjcte9LMuW-ExKvm94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwufRoDYqoxw6wkVZt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyJKAzv2owS7cT7OTl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxgfHQBbh5JEulMCV54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzMO8u5rWJu6QD5vm14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxAKipvhU6djXTDoeN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwnkk183NePkYh2ceV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
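The lookup-by-comment-ID step above amounts to parsing the model's JSON array and indexing the records by their `id` field. A minimal sketch in Python, using an abridged copy of the response shown above (the function name `index_by_id` and the variable names are illustrative, not part of the tool):

```python
import json

# Abridged raw LLM response: a JSON array of coding records, one per comment,
# taken from the full response shown above.
raw_response = """
[
  {"id": "ytc_Ugwjcte9LMuW-ExKvm94AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzMO8u5rWJu6QD5vm14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(response_text):
    """Parse the model output and index coding records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

# Look up the coding for the comment shown in the Coding Result table.
codings = index_by_id(raw_response)
coding = codings["ytc_Ugwjcte9LMuW-ExKvm94AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# -> company liability fear
```

Note that the record found this way matches the Coding Result table above (responsibility: company, reasoning: consequentialist, policy: liability, emotion: fear), which is exactly the consistency check this inspection view supports.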