Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I think it’s just existential insecurity due to how we behaved against creatures…" (`ytc_Ugw2CJh8o…`)
- "The rights of individuals have already been taken by parasite politicians alread…" (`ytc_UgxLaTYUO…`)
- "That’s when they start eradicating people by the millions the world won’t know w…" (`ytc_UgyhvEZLH…`)
- "Knock Ai all you want, but it’s the kindest, most patient, most accessible tutor…" (`ytc_Ugw-7jGGI…`)
- "It's the "G" in GPT that induces the hallucination effect, as the transformer is…" (`ytc_UgyT596gJ…`)
- "Karen Hao is wrong about tech companies not making their models more productive …" (`ytc_UgyGabqu1…`)
- "i think the point he was trying to make is that the stats saying "60% of all cod…" (`ytr_UgzxnPkL2…`)
- "who said robot will "demand" rights? did we asked animals or demand them? no / if …" (`ytc_Ugx2df_rp…`)
Comment
That's a brillant quote: "you can't keep up with the speed of development, it's going to speed up from every year to every 6 months, 3 Months, every month, then every week and then every day."
That is the hyperexponential accelleration of ASI.
One thing about singularity i can contribute, on my own thought.
My guess is, that once we have got a direct uplink to the AI-Digital-Space, then the singularity never occurs, because we are allways cerebral up to date, because we can outsource many things like like learning by heart, because we have allways acces to the facts and the state of the art scientific knowledge, which leaves us free for what human brain is prety good for. That will lead to the alloyment and meltingpot of human brain with neuronal ai and it's fundamental accessable database of knowledge.
youtube · AI Governance · 2025-10-07T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgydKu9yuwr6L3oGhAR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTVeNXca2gygEX9o94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0625gpRxwygf8_mF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3l_uHaYwIO1cmcCN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzda9HyzQ3bPuxVCGl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBzeN9JNkzvQfgHjl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzMAFRQeipukKwnfzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-kUrs0CLM_-N4zMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxPDEY7-nXy9OLqoFF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy3wZpKggohAm8h-494AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
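The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a response could be parsed and validated follows; the allowed value sets are inferred only from the values visible in this sample (the real code scheme may define additional categories), and the comment ID in the usage example is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; the actual code scheme may include more categories.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "fear", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting
    records that lack an id or a dimension, or use an unknown value."""
    codings = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            raise ValueError(f"record without id: {record!r}")
        for dim, allowed in ALLOWED.items():
            value = record.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
        codings[cid] = {dim: record[dim] for dim in ALLOWED}
    return codings

# Usage with a hypothetical one-record response:
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"mixed"}]')
print(parse_codings(raw)["ytc_EXAMPLE"]["emotion"])  # mixed
```

Failing fast on unknown values keeps malformed or drifting model output out of the coded dataset instead of silently storing it.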