Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@NeilGhosh yup, pretty sure it's just stock footage like it has always been, col…" (ytr_UgwoggPyl…)
- "What do you mean by "people fall for these things"? I dont think people actually…" (ytr_UgwdVNGVl…)
- "I think in office and Admin jobs it will be more of a oversight job, to check if…" (ytc_UgxMAIC9H…)
- "I got 2 AI adds while watching this video... Although I do not believe AGI will …" (ytc_UgxoubBpW…)
- "Update to this drama: Atrioc started to help with his own time and also donate t…" (ytc_UgzxVR_go…)
- "Geez....this idealist Friedberg thinks there will be more abundance. Yeah, sure!…" (ytc_UgyMGtqDm…)
- "Don't blame God if choose not to listen.... and Don't blame AI or robots if you …" (ytc_UgyWQMRyF…)
- "I understand your concern! The evolution of AI and robotics can definitely feel …" (ytr_UgxRF_GPj…)
Comment
Demis framing AGI as a consistency problem rather than just a capability problem is the key insight most people miss. Current LLMs are impressive but inconsistent — gold medal math one moment, counting errors the next. True general intelligence requires architectural changes beyond just scaling transformers. What's particularly relevant is his point about the incremental transition vs step function debate: edge-distributed AI models that run persistently at the source of data may actually accelerate that consistency problem being solved. @edge-41 is working in exactly this direction.
Source: youtube · 2026-04-06T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzUGy4iLBTkcGkUE914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxpgUgHIIxpmKt_3o54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBd8L5QHxD-rZX_9N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAtuqnLcauoseX0EJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwojJ8LoSMquFTqCMN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSMFLUg-zXdpQl55t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyevvChNwdD8uGqBWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRmKCqXQ_IHZhra9J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz25YVlAhboRUZhXnJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwAnALOn5woNm7c1wd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
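The raw response above is a JSON array with one object per comment, keyed by `id` and carrying one value for each coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and indexed for the "Look up by comment ID" view; the field names and example rows are taken from the response above, while the helper name and validation logic are assumptions, not the page's actual implementation:

```python
import json

# Two rows copied from the raw LLM response above, used as sample input.
raw = '''[
  {"id":"ytc_UgzUGy4iLBTkcGkUE914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwAnALOn5woNm7c1wd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a raw coding response and index the rows by comment ID.

    Raises ValueError if a row is missing the ID or any dimension,
    so malformed model output fails loudly instead of silently.
    """
    rows = json.loads(raw_json)
    codes = {}
    for row in rows:
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row: {row!r} (missing {missing})")
        codes[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return codes

codes = index_by_id(raw)
print(codes["ytc_UgwAnALOn5woNm7c1wd4AaABAg"]["emotion"])  # outrage
```

Indexing by ID up front makes each lookup a dictionary access rather than a scan of the array, which matters when one response file covers many coded comments.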