Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews):

- ytc_UgybaAnk-…: "what i dont get is that people actually admires this ai imageries, its nothing b…"
- ytc_UgxYxFBYg…: "My phone is for real watching me just wrote an essay about ai and this was one o…"
- ytc_UgzbJmfk5…: "Because "They all look alike?" Facial recognition is RACIST! okay so I guess it …"
- ytc_Ugx07Tsiy…: "Don't forget to sue the FUG out of the software maker. Because a report was done…"
- ytc_UgwePDAih…: "Get all your money from your bank account before this AI code get's in producti…"
- ytc_UgyzBKIVu…: "Learn to use AI, that's not going to get you a job because in 6 months everythin…"
- ytc_UgzmDPfxc…: "This is so utterly idiotic. We are not ready for this technology, this is the wo…"
- ytc_Ugyb-N9TD…: "Ai has yet to develope a sense of humor and I don't mean repeating old jokes.…"
Comment
Hallucinations happen when the conversation begins to represent too many different concepts, the LLM has to integrate the entire past conversation to keep track of what it said in the past. To solve the problem of excessive input to the LLM summarizers are used to consolidate the past conversations, but this approach begins to distort the past conversation as the summarizers leave out details. The effect ends up causing the LLM to respond incoherently, or repeating itself. There is way to prevent that and make the LLM more robust in dealing with shifts in context of a conversation and coping with detail loss from summarizer's. given that, achieving AGI is possible, particularly when Mixture of experts architectures are used.
youtube
2025-11-14T15:5…
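The comment above describes a real context-management pattern: once a transcript exceeds the model's input budget, older turns are collapsed into a summary, and that summary can drop details the model later needs. A minimal sketch of that rolling-summarization scheme, with a deliberately lossy stand-in for the summarizer (all names here are illustrative, not from any particular library):

```python
# Sketch of rolling conversation summarization, the mechanism the comment
# describes. The summarizer is a stand-in that drops detail on purpose,
# to show where information loss enters the pipeline.

def summarize(turns):
    # Stand-in for an LLM summarizer: keep only the first sentence of
    # each turn, discarding everything after it.
    return " ".join(t.split(". ")[0] for t in turns)

def build_context(history, max_turns=4):
    """Return the text sent to the model: a lossy summary of older turns
    plus the most recent turns verbatim."""
    if len(history) <= max_turns:
        return "\n".join(history)
    old, recent = history[:-max_turns], history[-max_turns:]
    return ("Summary of earlier conversation: " + summarize(old)
            + "\n" + "\n".join(recent))
```

Details present only in the summarized turns (here, everything after each turn's first sentence) never reach the model, which is the distortion the commenter attributes to incoherent or repetitive replies.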
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwks4mHM8WXQqfftkl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxrvpruH7MiJ7j2RLF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx8dHINYRvJiTlrUxF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDBn6JOeeSNhT2sG14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxgsFY8bEAmEEnB8ZN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugzr1gtGJmdgyJYlnxd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwN7JL1DDWTh9XUD5R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwKgT96rpCtbUZs7ZB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw08i2BIu3ZtgFV4G14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwteu3kffLuiwMBnh94AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
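The coding table above is presumably derived from this raw JSON array: each object carries a comment ID plus the four coded dimensions. A minimal sketch of how the "Look up by comment ID" feature could parse and index such a response (the function name and validation rule are assumptions, not the tool's actual implementation; the two rows are copied verbatim from the response above):

```python
import json

# Two rows copied from the raw model response shown above.
raw_response = """
[
 {"id":"ytc_Ugwks4mHM8WXQqfftkl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwDBn6JOeeSNhT2sG14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse the model's JSON array and index rows by comment ID,
    dropping any row that is missing a coding dimension."""
    rows = json.loads(response_text)
    return {
        row["id"]: {d: row[d] for d in DIMENSIONS}
        for row in rows
        if all(d in row for d in DIMENSIONS)
    }

coded = index_by_id(raw_response)
print(coded["ytc_UgwDBn6JOeeSNhT2sG14AaABAg"]["emotion"])  # outrage
```

Validating that every row contains all four dimensions before indexing guards against the common failure mode where the model emits a truncated or malformed array.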