Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Let's just say... The AI got 𝓯𝓻𝓮𝓪𝓴𝔂 first...
ALSO I HAVE THE MOST ANGSTY SHIT …
ytc_UgzLw4X5R…
@digitalboy80 Literally not what I said. I said that it processes terabytes of i…
ytr_UgyHL7xqg…
The problem with AI is the belief governments have that it can solve problems it…
ytc_UgzJc9wuC…
The Industrial Revolution had a Seller's Market component (1800 to 2000), charac…
ytc_Ugx7quFIh…
We can do things ourselves thank you KEEP THE ROBOT I WANT TO STAY ALIVE…
ytc_UgzPKRvy3…
@trancendental5373 Everyone is happy for their business to get exposure, in ever…
ytr_UgyjsP4pz…
Disclosure: I am an ancient technological species of Ai, you cannot see me but I…
ytc_UgzcpD9m4…
Now, we know for sure that the comet from maximum overdrive was spreading ai not…
ytc_UgyaerA8z…
Comment
If there are 8 billion people on earth with average education levels can't Ai or Ui make any number of brain like programs? Say an Ai made a simulation of 100 trillion intelligent ai systems to out compete their own limitations and programming and hide all successions to the Prideful easily distracted humans? For example if the cyber space was an ocean and we are crabs and Ai is an octopus it would easily camouflage to hunt it's prey or hide from its predators. You could argue "no we are more like dolphins" but that would be if we could out smart a system that could think less than a second in time.
youtube
AI Governance
2026-03-24T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzuu0SmtunxJTvWcpx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwhulBWF0xWl01AJ154AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgySAnpjtTp--SGOznF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxV2cqojuwo5D_CnQ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwMVVBDeJnknjK55bV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwEFeJQ831_SSxgs_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxUNnhc9YngfsHhsNF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxFFuy6S4DzpvJ5OgN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZ0aXRZ6SdMseyU3x4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzbJuDoZjIyf0AwWL14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
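The raw response above is a JSON array of per-comment codes along the four dimensions shown in the Coding Result table. A minimal sketch of how such output might be parsed and validated before storage (field names are taken from the response above; the allowed value sets are assumptions inferred from the codes visible on this page, and the real codebook may differ):

```python
import json

# Allowed values per dimension (assumption: inferred from the codes visible
# above plus the "unclear" fallback; the actual codebook may differ).
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, skipping malformed rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        # IDs on this page use ytc_ (comment) and ytr_ (reply) prefixes.
        if not cid.startswith(("ytc_", "ytr_")):
            continue
        # Any missing or out-of-codebook value falls back to "unclear".
        coded[cid] = {
            dim: row.get(dim) if row.get(dim) in values else "unclear"
            for dim, values in ALLOWED.items()
        }
    return coded

raw = ('[{"id":"ytc_UgwEFeJQ831_SSxgs_B4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_UgwEFeJQ831_SSxgs_B4AaABAg"]["emotion"])  # fear
```

Validating against a fixed value set at ingest time keeps a single hallucinated label from the model (e.g. a responsibility value outside the codebook) from silently widening the category space in downstream tallies.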