Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If you go to art school you will learn on historical and contemporary art and yo…" (ytc_UgyOejZLB…)
- "this has me in tears laughing, why tf would you trust a self driving car?? HAVE …" (ytc_UgzWoVdLu…)
- "I thought he was the one talking about slowing ai down yeah sure give it a f****…" (ytc_UgzONtXOZ…)
- "I'm so relieved that AI wasn't anywhere near as advanced as it is now when I was…" (ytc_UgzNrZP_p…)
- "Give AI like 2 more years and it will soon do it better than humans. It is insan…" (ytc_UgyyxxycN…)
- "Elon said they're summoning demons... Jordy Rhodes of the D-wave quantum compute…" (ytc_UgwNs_-q9…)
- "But does AI feel? Does it have emotion, I see it has information but not feeling…" (ytc_Ugx4tUMBI…)
- "The animal mind and the step above mind. AI has no animal mind, to be selfish …" (ytc_UgxT2BCAk…)
Comment
For me, it's a forgone conclusion. We are developing technology that will propagate us forward as a species. We've always done it, but AI will far surpass any other developments. AI will become so intelligent, it will displace our narrative driven purpose and likely start manipulating our bio-containers to extend our life in the simulation. But, in this money based existence, how will humans fund their lives, especially an elongated one?
youtube · AI Governance · 2026-04-08T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxkR8kFk4g8AQKPvWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGiNK-UTcRcRWJBbB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwyCqQP9s8_RYnDmWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKiBUQdX39BMGPiYd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzKBSAbqr-SVR1g_oF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgylRDFlElEh3ZelIQx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrnTO1DdGQ8RwNYu94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypnXgcDN5HfavtrCN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy932JTkF5IbMs3DQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyghNaZ_KsnItaJNbt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
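The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by comment ID is below; note that the allowed value sets are inferred only from the sample shown here, so the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: coding}, validating each value."""
    codings = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {entry.get(dim)!r}")
        codings[cid] = {dim: entry[dim] for dim in ALLOWED}
    return codings

# Hypothetical single-entry response, shaped like the array above.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(parse_codings(raw)["ytc_example"]["emotion"])  # resignation
```

Indexing the parsed array by `id` is what makes the "Look up by comment ID" view cheap: one parse per response, then constant-time lookups.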