Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):
A lot of these “AI won’t replace developers” arguments are anchored in the curre…
ytc_UgxKQyiIx…
29:55. We’re in an impossible situation. China is not going to stop their AI res…
ytc_UgzEV7NYE…
AI = (fallen) Angelic Intelligence. Evil nephilim spirits are using AI as a forc…
ytc_Ugw-ejPkQ…
I don't see AI just talking on your behalf to another AI when that can be done o…
ytc_Ugz9m_Wat…
Saying “AI is out of control” makes no sense. The people managing and developing…
ytc_UgyIXpvRe…
AI is the technology that's going to end humanity and I'm not joking. Companies …
ytc_Ugzl0tuAF…
Yeah, don't worry, AI is hype and a bubble- the UBI dystopia isnt coming anytime…
ytc_UgwpX0pcu…
The thing that is worrisome about AI is simple - it's humanity's behaviour for t…
ytc_UgzORCPgY…
Comment
If we are in a simulation, and the infinitely intelligent God created us; how can we as the finite intelligence create the infinitely intelligent god that will become the creator of another simulation? And why did God send a book of instructions on how to escape the simulation? I would worry more that AI will recreate the truth before I care about it killing everyone. AGI is just man's final attempt to "become as God"
youtube · AI Governance · 2025-09-08T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy9SnQmKT0aNjbONYt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw7KeUxE0lrngA9y-x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_86EHfeBKD7iCS0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4AhwjSFzNR0D112Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzTU5kDGOi1cmJAZTF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyGjfwdqa5v5fGoyB94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGGs8Eap3mFUrPDEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxg6_Xlwd7bR9QKlmh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFTckaKCurmO6pvqx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw4StSooTxiNS9BT7d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
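A response like the one above can be parsed and indexed by comment ID to support the lookup shown in this view. The sketch below is a minimal example: the allowed label sets in `SCHEMA` are inferred only from the values visible in this dump (the full codebook may define more labels), and `index_codings` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Allowed labels per coding dimension. ASSUMPTION: inferred from the values
# visible in this dump; the real codebook may include additional labels.
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself", "mixed"},
    "reasoning": {"none", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "mixed"},
    "emotion": {"none", "fear", "approval", "indifference", "outrage",
                "resignation", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and index
    the records by comment ID, skipping any record whose labels fall
    outside the schema."""
    by_id = {}
    for record in json.loads(raw_response):
        labels = {dim: record.get(dim) for dim in SCHEMA}
        if all(labels[dim] in SCHEMA[dim] for dim in SCHEMA):
            by_id[record["id"]] = labels
    return by_id

# One record copied from the response above, used as sample input.
raw = '''[
  {"id":"ytc_Ugy9SnQmKT0aNjbONYt4AaABAg","responsibility":"company",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''
coded = index_codings(raw)
print(coded["ytc_Ugy9SnQmKT0aNjbONYt4AaABAg"]["emotion"])  # fear
```

Validating against the schema before indexing catches the common failure mode of LLM coders inventing labels outside the codebook; invalid records are dropped here, but they could equally be queued for manual re-coding.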