Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I don't get this. Why are you loosing an already won war? Like, most of your arg…" (`ytc_UgzdUyEaR…`)
- "Im a personal trainer. One of my clients loves to talk about Ai, and I like sci-…" (`ytc_Ugyc47gvk…`)
- "That particular chatGpt AI is sweating bullets trying to hide the fact that all …" (`ytc_UgydGd1ON…`)
- "51:23 God bless Neil for the Rosie the Robot memory, and the admonishment of laz…" (`ytc_Ugzad2GVc…`)
- "I use Ai professionally and the challenge is managing the context window. The t…" (`ytc_UgylKRXOe…`)
- "@OdysseyABMS Yes and the question should be when will an AI be able to fix and i…" (`ytr_Ugwe2IiHw…`)
- "If you use ai at all, you are a thief. The argument you are made can be equated …" (`ytr_UgxMXtWvR…`)
- "Funny shit: once I can't copy and paste comments in YouTube app, I took a screen…" (`ytc_Ugx37k4fu…`)
Comment
current models lack critical components the human brain has.
like being able to learn on the fly. current models only have a temporary memory in form of tokens, so everything it "learned" just gets fed back in to the neural network until the token count gets reset.
ai models also have just one sense. like reading, hearing etc. its basically separated models working together.
and lastly models dont think critically.
instead they rerun their word completion model, which improves outcomes slightly but still dont understand the subject discussed.
my prediction is that it will take another 5-10 years until we get agis.
Source: youtube · AI Responsibility · 2025-09-30T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzQk-muSb2r5fd_5zx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlMorIq5orIC5_Rll4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyrNAW8hqMzMS9DRJt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzpc597Xf8mUgb4gMR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlfGLue_u8PwO5u9l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzdW_X4piqsDgEy0Sx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpTTvpQAzTK_otGlJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyo8mZns9RZS7APme14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgybxVrbQlagUyLgjSV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyqismwU-rgY2NlkpB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
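Since each raw response is a JSON array of coding records keyed by comment ID, looking up the coding for a specific comment reduces to parsing the array and indexing it by `id`. Below is a minimal sketch of that lookup, assuming the response format shown above; the excerpt reuses two records from the response, and the variable names (`raw_response`, `codings`) are illustrative, not part of any real tool.

```python
import json

# Excerpt of the raw LLM response above: a JSON array of coding records,
# one per comment, with the same field names as the coding dimensions.
raw_response = '''
[
  {"id": "ytc_UgyrNAW8hqMzMS9DRJt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugzpc597Xf8mUgb4gMR4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
'''

# Index the records by comment ID so any coded comment can be looked up directly.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codings["ytc_UgyrNAW8hqMzMS9DRJt4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer resignation
```

The dict comprehension gives O(1) lookups per ID, which matters once the responses span thousands of coded comments rather than the ten shown here.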