Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- `ytc_UgwjMZryy…` The company doesn't exist to provide people with pay and benefits. It exists to …
- `ytr_Ugx8v6xyf…` It all seems WAY TOO SAD AF knowing that the "plan for AI" was to have it on a c…
- `ytc_Ugz3CrdK7…` Ah, yes, Elon musk who makes his cars utilize the cloud and is against The right…
- `ytc_UgwDcmiWV…` Let me take it up a notch with what the Queen just said. Montgomery county, MD…
- `ytc_UgyypXm5k…` America is full of easily solvable problems. AI will just become the worst of Am…
- `ytc_UgwXhGkPp…` You’re not an artist if you literally use AI to generate the image bro 💀…
- `ytc_UgyXGwroB…` So tell me again why we need to have more kids to populate ...as per Elon Musk? …
- `ytc_UgwTu98Uv…` Interesting ChatGPT response... skews what an anagram is... why is not ´Satoshi …
Comment
It's new and still working out the kinks, but tbh I've yet to see it get stuck in a loop. We have a well-structured planning system.
As for tokens: I use Claude Code Max 20, so I don't really have to worry about them, but we do consider them and are always looking at ways to reduce usage. Watchdog is a good example: rather than a detailed prompt instruction, it's now a program, so it reduces token usage on that end.
We try to give agents just enough awareness to know that something exists — tools, a process, etc. When an agent interacts with email, it has general command knowledge, but how to structure your send or response context is only presented when the tool is being used.
Drone is our command routing. Agents don't need to know 1,000 commands and paths; `drone @agent <args>` resolves it all for them, so that alone is a massive token saver.
I also have an AI language compression system in the works: AIPL (AIPass Language), for AI-to-AI communication and AI files rarely or never read by a human. I'm seeing around 7-40% token reduction in daily activities. It's still human readable, but more suited for AI; if you read your agent memories you'll understand them but think 'not for me.' Haha, and that's the whole point. Your outputs will be totally fine.
Source: reddit · Viral AI Reaction · timestamp 1776963151.0 · ♥ 2
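The `drone @agent <args>` routing the comment describes can be sketched roughly as below. This is an illustrative assumption, not the commenter's actual tool: the agent table, paths, and `route` function are all made up to show how a single generic command can stand in for a large command catalogue.

```python
# Sketch of a "drone"-style command router: agents issue one generic
# invocation ("drone @agent <args>") and the router resolves the concrete
# command path, so agents never carry the full command list in context.
# All names and paths here are hypothetical.
AGENT_COMMANDS = {
    "mailer": "/opt/agents/mailer/run",
    "watchdog": "/opt/agents/watchdog/run",
}

def route(invocation: str) -> list[str]:
    """Turn 'drone @agent arg1 arg2' into a concrete argv list."""
    parts = invocation.split()
    if len(parts) < 2 or parts[0] != "drone" or not parts[1].startswith("@"):
        raise ValueError(f"not a drone invocation: {invocation!r}")
    agent = parts[1][1:]
    try:
        path = AGENT_COMMANDS[agent]
    except KeyError:
        raise ValueError(f"unknown agent: {agent}") from None
    # The resolved argv would then be handed to the process runner.
    return [path, *parts[2:]]

argv = route("drone @watchdog --scan logs/")
# argv == ["/opt/agents/watchdog/run", "--scan", "logs/"]
```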
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[{"id":"rdc_ohsst2w","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"rdc_ohus8ki","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"rdc_ohunuq6","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"rdc_kskdx2l","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"rdc_ksmawjs","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]
```
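The raw response is a JSON array of per-comment codings across the four dimensions shown in the result table. A minimal sketch of parsing and sanity-checking such a batch before rendering it; the `parse_codings` function is an assumption, and the allowed-value sets below only contain values that appear on this page (the real codebook is presumably larger):

```python
import json

# Values observed on this page; the full codebook likely has more options.
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw batch response, keeping only rows that have an id and
    a known value for every coding dimension."""
    rows = json.loads(raw)
    out = []
    for row in rows:
        ok = all(row.get(dim) in values for dim, values in ALLOWED.items())
        if ok and "id" in row:
            out.append(row)
    return out

raw = ('[{"id":"rdc_ohsst2w","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"approval"},'
       '{"id":"rdc_kskdx2l","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
rows = parse_codings(raw)
# Both rows survive; a row with an out-of-vocabulary value would be dropped.
```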