Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Timeline is wrong he knows this, he knows LLMs are stagnating relying on massive compute and the costs are getting more and more unsustainable. There hasn’t been any core changes to LLMs in a long time. I think he’s realized this more than anything makes his job obsolete so instead he decided to write rules and safety for a possible AGI or super intelligence and realized it’s not possible but how can it be when it doesn’t exist or isn’t close to existing? I dont think it’s possible without a quantum computing breakthrough and as someone who is a researcher in the field I’m less and less optimistic that this will happen any time soon if ever.
youtube · AI Governance · 2025-09-04T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
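The coded dimensions above follow a fixed schema. As a minimal validation sketch, the allowed values below are inferred only from the codes visible on this page (not an exhaustive codebook), and the `validate_row` helper is an assumption, not part of the tool:

```python
# Allowed values inferred from the examples on this page; the real codebook
# may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "developer"},
    "reasoning": {"none", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"none", "fear", "approval", "outrage", "mixed",
                "resignation", "indifference"},
}

def validate_row(row: dict) -> list:
    """Return the dimension names whose coded value is outside the known set."""
    return [dim for dim, allowed in ALLOWED.items()
            if row.get(dim) not in allowed]

# The coding result shown in the table above.
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "none", "emotion": "fear"}
problems = validate_row(row)
```

An empty `problems` list means every dimension carries a recognized value.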
Raw LLM Response
```json
[
{"id":"ytc_Ugy63kpjuhhgUMJY7jh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZEMQeXEHwivjfCvV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKumYzVw5xELudLKB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxKhz-1fpjD635mdmR4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMrG4KDqj0Zz9qLVd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugws8RSwTVfbiJ4EIfN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgydXxD5KENHlXg7r-d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxs1VIgK9XlsErCNxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgycMF7xVAmYqRtxfF14AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKBuDMhUknUqCUHNZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
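The response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal parsing sketch (IDs and field names are taken verbatim from the response; the `lookup_codes` helper itself is an assumption):

```python
import json

# A two-row excerpt of the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_Ugws8RSwTVfbiJ4EIfN4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxs1VIgK9XlsErCNxR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]'''

def lookup_codes(raw: str, comment_id: str):
    """Parse the model output and return the coding row for one comment ID,
    or None if the model did not code that comment."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

codes = lookup_codes(raw_response, "ytc_Ugws8RSwTVfbiJ4EIfN4AaABAg")
```

Here `codes` is the dict of dimension values for the requested comment, matching the Coding Result table above.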