Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "You can solve any problem by simply killing everyone the problem affects. When…" — ytc_UgyMBv9tv…
- "My Phase I rack level Agentic AI system will be designed with on purpose in mind…" — ytc_UgxeJI8kv…
- "I can also explain to you why the AI will replace all jobs, which means you will…" — ytc_Ugw5Yc-4q…
- "@conspiracymusic with conspiracy in your name, surely you have considered the si…" — ytr_Ugw_iX4V5…
- "it didn't inspire people it enraged them. Ai stole from millions of artists to c…" — ytr_UgxYxv7iB…
- "The Matrix, 2001 A Space Odyssey, I-Robot, Portal, Ultron, *Gestures vaguely at…" — rdc_m15fnaq
- "If you have 1,000 programmers, AI will make it, eventually, to only need 50 just…" — ytc_UgxG1F_sC…
- "As someone who likes to play around with AI art I can say that the current versi…" — ytc_UgzWMM5Ws…
Comment
As you all know, LLMs are dynamic non-linear adaptive filters that perform lossy semantic and syntactic compression. The larger they get, the more they are able to BOTH memorize specific facts AND generalize abstract patterns, due to deep double descent. This is like a human with eidetic memory (like Rainman): they are reconstructing the text probabilistically from weights, not retrieving static files from a hard drive. So it's not quite like file compression because the reconstruction is probabilistic and non-deterministic. I'm happy to argue this point and if I'm wrong anywhere, it's here. Still, OpenAI and the others ingested trade secrets and copyrighted text without consent. They essentially laundered intellectual property through a neural network, stripping the metadata but exploiting the value while pleading ignorance. Saying LLMs retain text might be incorrect, but per copyright law and the de minimis non curat lex concept, I think the AI companies are cooked.
youtube
2026-01-18T21:3…
♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyunfLve-L562axr7N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwqVXhT_hhqAq2D21t4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzCo0L3KqyiPARg1xd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMoAD5ELYYfvoDTB54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpJEegG83Fv_WtRIF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxgskV-Ok6_SaEkk7d4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwhAzbtRO6qwJg2f_l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwR1tiOLPAT6eKSEjN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzx61ugRZ3Pih6Aaoh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzSJedXilFTX30pJwt4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"resignation"}
]
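The raw response is a plain JSON array of per-comment coding records, so looking up one comment's codes takes only the standard library. A minimal sketch, assuming the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the sample record is copied verbatim from the array:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# This single record is copied from the response above for illustration.
raw = """[
  {"id": "ytc_UgzMoAD5ELYYfvoDTB54AaABAg",
   "responsibility": "unclear", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

# Index the records by comment ID for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

rec = records["ytc_UgzMoAD5ELYYfvoDTB54AaABAg"]
print(rec["emotion"])  # → indifference
```

This record matches the Coding Result table above: three `unclear` dimensions plus `emotion: indifference`.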