Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
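Programmatically, the same lookup can be sketched as below. This is a minimal illustration, not the tool's actual API: it assumes each coding batch is stored as a JSON file containing an array of coded records like the one shown at the bottom of this page, and the directory name and function are hypothetical.

```python
import json
from pathlib import Path

def find_raw_response(comment_id: str, batch_dir: str = "raw_responses") -> dict | None:
    """Return the coded record for `comment_id`, or None if it was never coded.

    Assumes each batch was saved as a JSON array of records with an "id" field.
    """
    for path in Path(batch_dir).glob("*.json"):
        for record in json.loads(path.read_text(encoding="utf-8")):
            if record.get("id") == comment_id:
                return record
    return None

# Example: one of the reply IDs that appears in the raw response below.
print(find_raw_response("ytr_UgxGfac71Bv90qzy90x4AaABAg.AVSz_XNa0RAAVT02egDCWE"))
```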
Random samples — click to inspect
- "People, especially young people, have been coming up with crazy ideas with amazi…" (ytc_UgxpIxEJy…)
- "@joebro4411 This is outdated logic though. We have decision neural networks that…" (ytr_Ugz747iyi…)
- "Well done Mr Sanders, you make very relevant points and propose good solutions. …" (ytc_UgzAzF0UP…)
- "Ok 1% chance or 5 or 10 or even more of extinction with the creation of AI but w…" (ytc_Ugzqr2pwW…)
- "It’s all about greed money power controlment AI is not a good future. We all kno…" (ytc_UgzTy0OGn…)
- "It’s kind of scary that nowadays children are being groomed by computers because…" (ytc_Ugwv4oLk6…)
- "A lot of the technologies that took decades to hit the mainstream were artificia…" (ytc_Ugx6jTbz9…)
- "Im imaging drunk people taking the food out and eating it then the using the ro…" (ytc_UgzcdiIuG…)
Comment
@mattbeisser3932
I could be wrong then, and they could be lying about storing text, which is likely. My argument about the unfeasiblility of storing training data is probably wrong when it comes to text, and only entirely true for image generators. However, there's another plausible explanation still.
Rather than giving equal weight to all training data, they probably gave really heavy weight to the most popular books, as they would produce "the best writing". If you feed the AI a chunk of text that verbatim comes from a book with heavy weight applied in training, it can only output the rest of the book's text with 95% probability because its training tells it that those words are incredibly likely to be what comes next. Just like how people can memorize a speech given enough repetition, given enough weight the LLM can reproduce a book, and it's capacity for doing so is beyond human.
youtube · AI Responsibility · 2026-04-11T13:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
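The coded record behind this table can be represented as a small structure. The sketch below mirrors the dimensions shown; the class name and the placeholder ID are illustrative assumptions, and the category lists in the comments are inferred only from the records on this page.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str  # seen here: none, company, user, ai_itself, distributed
    reasoning: str       # seen here: consequentialist, deontological, unclear
    policy: str          # seen here: regulate, liability, unclear
    emotion: str         # seen here: indifference, outrage
    coded_at: datetime

# The values from the table above, with a hypothetical ID for illustration.
result = CodingResult(
    comment_id="ytc_example",
    responsibility="none",
    reasoning="unclear",
    policy="unclear",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
```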
Raw LLM Response
[
{"id":"ytr_UgxgBQWvEZxSE7kBlCN4AaABAg.AVT-2cekWzCAVT2xo2tFeZ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxgBQWvEZxSE7kBlCN4AaABAg.AVT-2cekWzCAVT5CB-6Vyw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxgBQWvEZxSE7kBlCN4AaABAg.AVT-2cekWzCAVT5gxBHHw0","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxgBQWvEZxSE7kBlCN4AaABAg.AVT-2cekWzCAVT6R0zkc3b","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgxgBQWvEZxSE7kBlCN4AaABAg.AVT-2cekWzCAVTR9jWikNG","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxGfac71Bv90qzy90x4AaABAg.AVSz_XNa0RAAVT02egDCWE","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzAPpAjuyOTe7mHfdF4AaABAg.AVSzEo46MnjAVT0FoKO21k","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgzAPpAjuyOTe7mHfdF4AaABAg.AVSzEo46MnjAVTQpaZAaVx","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyS0R6NCfSUpbfw0cZ4AaABAg.AVSzAbUkVZdAVT-bZr_wEE","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgydEieeJ3CQie_IVjB4AaABAg.AVSz3wVo0kFAVT1h_kZjGP","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
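A raw response like the one above is a single JSON array covering a whole batch of comments, so it has to be indexed by comment ID before individual coding results can be displayed or aggregated. A minimal sketch, assuming only the "id" field plus the four coding dimensions shown above (the helper names are hypothetical):

```python
import json
from collections import Counter

def parse_batch(raw: str) -> dict[str, dict]:
    """Index a raw batch response (a JSON array of coded records) by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

def dimension_counts(results: dict[str, dict], dimension: str) -> Counter:
    """Count how often each category appears for one coding dimension."""
    return Counter(record.get(dimension, "unclear") for record in results.values())

# Abbreviated example (real IDs are much longer, as in the batch above).
raw = '''[
  {"id": "a", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "b", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"}
]'''
coded = parse_batch(raw)
print(dimension_counts(coded, "responsibility"))  # Counter({'none': 1, 'company': 1})
```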