Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
To all humanity your replacement is finally here now they can kill us off in abu…
ytc_Ugy0q5jAW…
One aspect I find hilarious is that someone posted a "redraw" with Coraline and …
ytc_Ugxoq4GFA…
It would be nice if all the links here and in the rest of your newsletter were c…
rdc_jf6s0rs
Interesting. If its so simple, then create actual quality images using AI. Peopl…
ytc_UgwYxiUvY…
I think AI art is fine to d@#$ around and make shitposts with, AI art will neve…
ytc_UgzEqMXzh…
Guy didn't know what type hinting is?
At the time you've asked him did he ever s…
ytc_UgyljHFsy…
if you ask the ai to find a number that doesn’t end with e, it will say “nine”…
ytc_Ugy0tktcc…
They need AI because there is not going to be any humans with these globalist lu…
ytc_UgzulBE3b…
Comment
Beg to differ with Mr. Harari on the AI being Alien Intelligence v Artificial. AI as we all know it is a man made creation with the intent of mimicking the abilities of human mind (memory, logic, inference, semantic associations) at scale and at speed, whilst still deeply void of our natural creativity, free-will, and emotional interference in decision making. He is wildly extrapolating the concept of AI.
youtube
AI Governance
2025-07-22T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwBmIud28l_qtSRqT94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJ-0xOEbWLoLhSZFV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzL9HzXDKb9ha0i6Rl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgztrpcDLVlnizzWtht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7vzxbFJBL1M3BlBl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6M_WEJeFM6BYGLfh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxdfryuQSOLLmxPnfh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznzNf6vSFBqlh3F4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFPPFg-dMfhv7Z8cd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwoE1RjmODEQJwnjZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
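The raw response above is a JSON array with one object per comment, keyed by `id` and carrying the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and looked up by comment ID (the `lookup_codes` helper name is hypothetical, not part of the tool; field names are taken from the response shown):

```python
import json

# Two entries copied from the raw response above, for illustration.
raw_response = """
[
 {"id":"ytc_UgwBmIud28l_qtSRqT94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugx7vzxbFJBL1M3BlBl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

def lookup_codes(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    # Index the array by comment ID for O(1) lookup.
    codes = {entry["id"]: entry for entry in json.loads(raw)}
    return codes.get(comment_id)

entry = lookup_codes(raw_response, "ytc_Ugx7vzxbFJBL1M3BlBl4AaABAg")
print(entry["responsibility"], entry["emotion"])  # developer fear
```

A real pipeline would also want to validate that each dimension's value falls in its expected label set (e.g. `policy` in `{"none", "regulate", "unclear", ...}`) before storing the codes.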