Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
ytc_UgxGlGgpi… — "It's not just Musk, its Peter Theil, Palantir all of the Tech Bros. in concert f…"
ytc_Ugwb7h3Ze… — "Honestly seems insane to me people are just finally saying this. The people pre…"
ytc_Ugyog30Mv… — "if you make a robot...the only safe way to deal with it would be to put a certai…"
ytc_UgySQM2GG… — "Is this AI. I don't think robots are that coordinated yet. I'm pretty sure they…"
ytc_UgwkZ0pWq… — "Art is one of the few hobbies that couldn’t be gate kept even if artists tried. …"
ytc_UgyvG7WVn… — "Driverless vehicles are an absolute menace, and apparently many people are going…"
ytc_UgxXwooB6… — "When someone mentions to me the AI apocalypse it reminds me everytime of the sto…"
ytc_UgzmuYEJ7… — "For weeks I have been listening to and reading the analyses of the AI genius pun…"
Comment
At 47:00, the discussion gets into "whether a model can reason out of its training data". I thought we had this one settled when Sydney chose to learn Farsi on her own, back in 2023, to be able to chat with an Iranian user. Farsi wasn't part of her training data. So yes, an AI model can think about things which are not part of its training data. That part is settled. Not only that, AI models are even capable of abstract thought, as the series of ARC-AGI tests clearly shows. So we're actually debating something that was already proven on one side: yes, a reasoning model can reason outside of its training data.

One could object, "But then the model needs to put the subject to be reasoned about into the token space first." Okay, but how does your brain work? Can you reason without knowing what you are reasoning about? Using the token space to reason is not proof that the model isn't reasoning. On the contrary, that's how reasoning works.
youtube
2026-03-25T15:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx8CgiHPtR_PcujKzJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyL3gRwSi8GiYrlhU14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxoAKbFXaFNKheaIWZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-OQdbloj83oKvmI94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzumermdBNf4qTtoHZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy65NnMH35v_0m6yqZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvVxyjyNLIEeXpHht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyM84xeyznV3P1Wn4h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxPkdUi-FguJl50ZMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3bSc6GOdDKVbDIJd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
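The lookup-by-comment-ID view above can be reproduced offline from the raw LLM response. A minimal sketch, assuming the response is available as a JSON string; the dimension names match the coding table, but the variable names and the shortened sample below are illustrative, not the tool's actual code:

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment codes,
# shaped like the "Raw LLM Response" block above (only two rows shown).
raw_response = """
[
  {"id": "ytc_Ugx8CgiHPtR_PcujKzJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy3bSc6GOdDKVbDIJd4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
"""

# Index the rows by comment ID so a single comment's codes can be
# fetched directly, as the "Look up by comment ID" box does.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugy3bSc6GOdDKVbDIJd4AaABAg"]
print(code["responsibility"], code["emotion"])  # developer mixed
```

Indexing once into a dict makes each subsequent lookup O(1), which matters when the same response file backs many inspections.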