Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
hmm... without mandate from federal government, which I do not support, it is im…
ytc_UgxyjZBBF…
"Deepfake getting more virality than actual news segments"- MSM is worried about…
ytc_UgzG1GEHf…
Makes me sad at tattoo conventions too, you can see the AI LAZINESS taking over.…
ytc_Ugz1_WKid…
OpenAI does not offer any AI products. LLMs are not AI. There is nothing scary o…
ytc_UgyGWtCjO…
For AI companies, the most terrifying thing is for the stock to crash and invest…
ytc_UgxMVuUkC…
Lol, I always talk to AI like I'm talking to a fellow human being
I even say Go…
ytc_Ugw_YVcYt…
And it gets worse the further up you go. The most powerful people in the industr…
rdc_n60t8z7
Let's just say... The AI got 𝓯𝓻𝓮𝓪𝓴𝔂 first...
ALSO I HAVE THE MOST ANGSTY SHIT …
ytc_UgzLw4X5R…
Comment
Let's see:
Stage 1: Correct.
Stage 2: Mostly correct, but this is not context. It's association.
Stage 3: Partly correct, but it isn't that accurate. Watson actually failed. And Alpha is very limited.
Stage 4: This is still association. It is very limited and not very accurate.
{moving from reality to myth and misunderstanding}
Stage 5: Incorrect. AGI is not directly related to AI. It is not a stage.
Stage 6: Incorrect. ASI is related to AGI but not AI. Also, there is no exponential increase.
Stage 7: There is no such thing as self aware AI. This would be AGI which is not related.
Stage 8: This is now just fantasy. Even ASI is not transcendent and there are no nanobots.
Stage 9: Nonsense.
Stage 10: More nonsense.
This video went from real to ding bat crazy in less than 11 minutes.
youtube
AI Governance
2023-10-30T08:0…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwPh4abbPTk01ekasp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzggv6ikwgryIrFlPN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDZP5RrShJ4zfCa2p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxQ17UN_r5YPN3xCpR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRfd10vvefaEdDOvp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwS5bxrsyki6c1lo8l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwCx6WRGFK10ibn5rl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxlk1NNdA2C-zNOTwZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1LWu60lvwsW28gWJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyuNRZGiRbenoaOw5t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
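The raw response above is a JSON array of per-comment coding objects, each carrying a comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" view — the schema is inferred from the example shown, and the actual pipeline may differ:

```python
import json

# Sample raw LLM response in the format shown above: a JSON array of
# per-comment coding objects. IDs taken from the examples in this page;
# the schema is an assumption based on the displayed output.
raw_response = """
[
  {"id": "ytc_UgwPh4abbPTk01ekasp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_n60t8z7", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw coding response and index it by comment ID,
    keeping only the expected dimensions (missing ones default to 'none')."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "none") for dim in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw_response)
print(codes["rdc_n60t8z7"]["emotion"])  # mixed
```

Indexing by ID also makes it straightforward to join a coding back to the stored comment text, as the detail view above does.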