Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The current model for "upscaling" an AI model is to crunch tons of data. The process requires a lot of processing power and a lot of energy, or a lot of time.
I really don't see how you could eventually replace the huge data centers they are trying to build to train AI with the processor and memory in a laptop.
I am sure progress will be made and it will become cheaper to build, but I really don't see this eventually becoming so cheap that a guy with a few computers in his basement can create one from scratch.
In the physical world, you eventually hit a wall. Perpetual improvement is not a thing. Something that doubles in power on every update will eventually reach its limit.
youtube · AI Governance · 2025-09-04T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy63kpjuhhgUMJY7jh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZEMQeXEHwivjfCvV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKumYzVw5xELudLKB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxKhz-1fpjD635mdmR4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMrG4KDqj0Zz9qLVd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugws8RSwTVfbiJ4EIfN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgydXxD5KENHlXg7r-d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxs1VIgK9XlsErCNxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgycMF7xVAmYqRtxfF14AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKBuDMhUknUqCUHNZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
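The raw response above is a JSON array where each entry carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and checked before loading it into the results table — the function name `parse_coding_response` is hypothetical, and the only assumption is the field layout visible in the sample above:

```python
import json

# The four dimensions coded for each comment, as seen in the response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch coding response into {comment_id: {dimension: value}}.

    Raises ValueError if an entry lacks an id or any coded dimension,
    so malformed model output fails loudly instead of loading silently.
    """
    records = json.loads(raw)
    coded = {}
    for entry in records:
        cid = entry.get("id")
        if not cid:
            raise ValueError(f"entry without id: {entry!r}")
        missing = [d for d in DIMENSIONS if d not in entry]
        if missing:
            raise ValueError(f"{cid}: missing dimensions {missing}")
        coded[cid] = {d: entry[d] for d in DIMENSIONS}
    return coded

# Usage with two entries copied from the response above:
raw = '''[
  {"id":"ytc_Ugy63kpjuhhgUMJY7jh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugws8RSwTVfbiJ4EIfN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''
coded = parse_coding_response(raw)
print(coded["ytc_Ugws8RSwTVfbiJ4EIfN4AaABAg"]["emotion"])  # fear
```

Validating at parse time is what makes a "coded at" timestamp trustworthy: any entry the model returned without all four dimensions is rejected rather than recorded as partially coded.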