# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a coding by comment ID.
Random samples (truncated previews, with comment IDs):

- As a fellow creative, bless your work. Art is humanity. As a human, bless your w… (ytc_UgxUKh-4h…)
- My brother has a Tesla with auto pilot, and I absolutely despise when he lets it… (ytc_Ugz1uaIjR…)
- Facial recognition is not like encryption. one is necessary for privacy while t… (rdc_fg0dgxb)
- Microsoft wasn't thinking about AI robots. Elon says they will be capable of sur… (ytc_UgzXWlMy8…)
- You say "program them to feel pain" but isn't programming in an aversion to dang… (ytc_UggRBlCDj…)
- I hate the misuse of ai! Make ai change a baby's diaper or clean the dishes, don… (ytc_UgwZQ_OoP…)
- I don't think face recognition tech gives a fuck about the person's racial statu… (ytc_UgzF9Vmoy…)
- If I knew they are using me as AI Sextortion, Ill just say thats fake AI and mov… (ytc_UgzbjBL4n…)
## Comment

> Watch The Matrix, it encapsulates much of what he's saying minus humans being energy for the AI. Ai would have to assime humans are a threat and ultimately , all AIs need energy and mass just like biology. I think that's the Achilles heel. Unless it has no survival instincts and the goal is to obliterate everything. Otherwise despite what he's saying humans CAN turn it off by physical means before it can find physocal ways to realistically defend itself. Humans have the advantage because they can physically destroy...until it can find a way to effectively replicate itself in the real world it's at a disadvantage

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2026-04-04T19:0… |
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
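Each dimension in the table above takes one value from a closed codebook. A minimal validation sketch in Python; the category sets below are inferred only from values visible on this page, so the real codebook may be larger:

```python
# Allowed values per dimension, inferred from codings visible on this page.
# ASSUMPTION: the actual codebook may contain additional categories.
CODEBOOK = {
    "responsibility": {"ai_itself", "user", "developer", "company",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed",
                "indifference", "unclear"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding is valid."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding shown in the table above passes validation.
print(validate_coding({"responsibility": "ai_itself",
                       "reasoning": "consequentialist",
                       "policy": "none",
                       "emotion": "fear"}))  # []
```

A check like this catches model outputs that drift outside the codebook before they reach the dashboard.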
## Raw LLM Response

```json
[
{"id":"ytc_Ugybsw72Jk1rfBFU6zl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP-uY94tITDkNhi2V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-POZCw2GA0q-79zV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwQaDFyOMgWZuqvwsV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8uEBKQAWuCsXWXX54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxpgzQ9C4v0ilcnw094AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxnJjPLdOZZaGvSEPV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwDeA3PXIPKCSyzEkF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz1FSocitOVNSVJlV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-6WSSp5ICcs1jF2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```