Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
Random samples:
- ytc_UgzGo0fZJ… — "Interesting... but my opinion is this... humans designed AI and the computing ha…"
- ytc_UgwWYTWpf… — "There are jobs ai will never be able to take in our society, but obviously most …"
- ytc_UgyDS9_gR… — "The human interrupts and talks too much, so much so that an anticipated 'debate'…"
- ytc_Ugw3RW75_… — "I want content removed from my feed, I don't want to waste my time. AI could hel…"
- ytc_UgxziCgBd… — "All I know is humanity will have collapsed long before humanity ever sees an AI …"
- ytc_UgxSXE4BS… — "Wait until 2033 The evil 😈 Billionaires will decide who lives and who dies not A…"
- ytc_UgzgL3tjl… — "just pu a job that cant be autonomous, worse comes to worse get ur passport and …"
- ytc_Ugw-28Pgu… — "Also, it feels nice to be nice. I don't feel comfortable NOT being kind. I can i…"
Comment
It’s interesting, especially the bit at the end when they debate how AI will end humanity. Virus, or mirror life ect ect. I don’t think it’ll be like that, as it could be caught red handed before its release. Nano tech, the cure for all illnesses ect. We’d all willingly inject that, and it’d appear completely benign. Then after a few decades, all human disease is irradiated, 99%+ have adopted nano tech, AI sends a signal to sever our brain stem. Instant off switch.
Source: youtube · Topic: AI Governance · 2025-08-05T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwcpM_dzzsAqdyo-Vh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwvufGMO93nKgSQZTF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyMF1ck5eLCqabtmOt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyiyCL7W2fADdrrDlh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxxpeq-eDKJzq01g-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwdeC7KHgTZGLTn2ot4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyCuCExgWWXyqPZGlR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxylX64cdhPjmkl8wZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfAYmQXwtF1-uCpo94AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugw0UgAb8eioLCwbkUp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
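A downstream consumer of these raw responses typically parses and validates the batch before writing rows into the coding table. A minimal sketch in Python: the `SCHEMA` value sets are inferred only from the values visible on this page, not from the project's actual codebook, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. Assumption: inferred from the
# responses shown above; the real codebook may define more values.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "resignation", "approval",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError on malformed records or values outside the
    schema, so bad model output is caught before it reaches storage.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid or not cid.startswith("ytc_"):
            raise ValueError(f"bad comment id: {cid!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: {dim}={rec.get(dim)!r} not in codebook")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

Failing loudly on unknown values is deliberate: LLMs occasionally emit labels outside the codebook, and silently storing them would corrupt later aggregate counts.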