Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "@laurentiuvladutmanea how it it debunked? Ai extracts unique data form art, thus…" (ytr_UgwkBjNH_…)
- "The interviewer is a perfect example of a unconscious AI.. interrupts for the sa…" (ytc_UgxrpP8o7…)
- "There is no choice here. If he hires the person, that person would pay for an AI…" (ytr_Ugy7QHv1f…)
- "AI will just be a tool to make programming easier/more efficient. I use AI to ge…" (ytc_UgyC9wDTS…)
- "As an artist I agree with you. Most people don't understand how screwed my child…" (ytc_UgylRkzi_…)
- "For two years it has been Har Har Har AI slop will never be good enough Har Har…" (rdc_o5q5mve)
- "Yeah, I had no idea what that meant until the later message where he said it aga…" (rdc_n0nufvb)
- "So I went and chatted with Bing AI as soon as I finished this video. I started …" (ytc_UgwBegLhm…)
Comment
You fundamentally don't get it. The intelligence gap between you and ASI (Artificial SuperIntelligence) would be greater than the gap between a single neuron and all of human civilization. You are the neuron.
ASI doesn't stay at human level. It improves itself recursively: it makes itself smarter, then that smarter version makes itself even smarter, then that version improves itself again. This cycle repeats thousands of times in hours. Each iteration is exponentially more intelligent than the last. It doesn't take breaks. It doesn't sleep. Within days of creation, it's already incomprehensibly beyond us.
Could a mouse comprehend or defeat the entire US military? No. It can't even hold the concept in its brain. That's you versus ASI, except the gap is a million times larger.
By the time your neurons finish firing to think "we should stop this," it's already run a million simulations and won.
There is no "later invention." The moment ASI exists, humans stop making decisions about anything.
This is why actual AI safety researchers are terrified.
youtube · AI Governance · 2025-10-16T19:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgxfMqZ8scxJToInzul4AaABAg.AORtWWaqav-AORv8qoRPdX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugy-pPDt7SzHf6ylfBV4AaABAg.AORfRy2d9tSAORk6vPVd2R","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugy-pPDt7SzHf6ylfBV4AaABAg.AORfRy2d9tSAORnSSBHO0U","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz8ZegHMWaK-01BD554AaABAg.AOMQ88y9DoiAORdtGWMOpM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwCQBPrZDs3XaRyD114AaABAg.AOMK5cnEQ3fAOMNO6rCMZQ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw8qULQLdbBFsmGOaJ4AaABAg.AOLvtdOkkMnAOLy_6nKWh3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwwV3-hrRAmWmG1nQh4AaABAg.AOLolsAhOwaAOLzGnm_WUs","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgztvlOJR5VU0zQVcNZ4AaABAg.AOLedb1GTT0AOLzwD2zCwi","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz4EL1lsfsJfQFsJHh4AaABAg.AOLa1olpbhYAOM-wsUCesU","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytr_UgxoKmj3UCpxDYFWtAN4AaABAg.AOLZGOSiuqUAOM0NwcdRy2","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
```
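A minimal sketch of how a raw response like this could be parsed and checked before any row reaches the coding table. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON above; the example comment IDs and the `parse_codings` helper are hypothetical, and the validation is limited to checking that every row carries all four coded dimensions plus an ID.

```python
import json

# A hypothetical raw model response, shaped like the JSON dump above
# (these comment IDs are made up for the example).
raw = '''[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "none",
   "reasoning": "mixed", "policy": "liability", "emotion": "approval"}
]'''

# Keys every coded row must carry, per the schema visible in the dump.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with missing dimensions."""
    rows = json.loads(raw_response)
    for row in rows:
        missing = REQUIRED - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id', '?')} is missing {missing}")
    return rows

codings = parse_codings(raw)
print(codings[0]["emotion"])  # fear
```

Validating the key set up front means a malformed row fails loudly at ingest time instead of surfacing later as a blank cell in the coding-result table.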