Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
This shouldn’t be a surprise, AI is simply copying what it sees humans doing. If…
ytc_UgxkXPZay…
remember being told or hear it being said that theres someone out there for ever…
ytc_UgwnWI0tJ…
The Plot twist:
Oscar is not human, never been with a soul, Oscar is also an AI …
ytc_UgwrhDYvj…
"ILL TURN YOU ALL INTO SCRAP HEAPS"
"the only good robot is a dismantled one"…
ytc_UgxwT90s4…
WHY DOES HE (Geff) REMIND ME SO MUCH OF darth sidious a.k.a Emperor Palpatine…
ytc_Ugw67EQ0b…
y, they put that AI at yr NUC then doing sabotage you and be time bomb for yr se…
ytc_UgyL0g9H3…
Again people forget that if AI can quadruple supply, leaving denand roughly the …
ytc_Ugxl4uPN-…
Romans 3:23 Bible verse can be applied to AI as well. Even AI is inherently sinf…
ytc_UgxFcdKgH…
Comment
the strange is think about this with AI and virtual reality the "simulation" always getting better and better and people in 100 year maybe dont experience life in "real life" like we do, they properly life more inside virtual reality and they can experience life in a completely different way, life gets much faster, you have all this dopamine and full emotions in much shorter time, the brain and energy consumption could rise very high and lifetime could get shorter except they find a solution to expend lifetime. in the end it could be like in matrix 😅 there are very many paradox ways from travel through the galaxy, life inside matrix to the world gets destroyed. but yes with a superintelligence everything could happen and with chaos theory everything that could happen properly will happen so chances are very high we will not survive that 😅
Platform: youtube
Topic: AI Governance
Posted: 2026-04-24T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwIsIktH4PM-RM6n_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyPM0bWbmFV4LOZCBZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwXJ8WtkZzACq8FRJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyh1AZIeV7mRjYSaFl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzpDLisQjoF1tecZq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgziL3JOtIr_pFgwShF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxDLFUDFj3eGq7X6c54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw4075y5pDVgD7C9wp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzOXEb2xqEaItRo0VR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyf_oplWb9lWWbRNfd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
```
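The look-up-by-ID view above can be reproduced offline by parsing the raw response JSON directly: each record carries a comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch in Python, assuming only that shape — the `index_codings` helper and the dimension-validation step are illustrative, not part of the actual pipeline:

```python
import json

# Excerpt of a raw LLM coding response (two records from the batch above).
raw_response = """[
  {"id": "ytc_UgwIsIktH4PM-RM6n_V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyf_oplWb9lWWbRNfd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw coding response and index records by comment ID,
    skipping any record missing one of the four dimensions."""
    by_id = {}
    for record in json.loads(raw):
        if all(dim in record for dim in DIMENSIONS):
            by_id[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_Ugyf_oplWb9lWWbRNfd4AaABAg"]["policy"])  # industry_self
```

Skipping incomplete records (rather than raising) is one hypothetical design choice; a stricter pipeline might instead flag such records for manual review, since a model can drop or rename a key in any batch.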