Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So I'm working on a multi-agent workflow at the moment as the frontend lead. We're moving to support AG-UI events in addition to our own. Bleeding edge, but the docs' authors give very few recipes, and the ones they do give use CopilotKit, which abstracts it all away, and even then it's riddled with unused declarations and dead code branches (in other words, a hot mess). Anyway, I spent a full day in Claude Opus going back and forth in plan mode, only for it to get confused and start creating and emitting its own events in my event bus on the frontend. Oh, and then the naming as well: the bus that handles and pipes events to the correct place on the client (frontend) is also called an Agent. As if the term isn't overloaded enough already. Just call it what it is. But then the PR team couldn't spin it. Nuance that comes second nature to a human engineer is completely missed.
youtube AI Jobs 2026-02-19T01:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugyf5xQ8whtO3HQl8nB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgzpaWdABbZx8sGwmE94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_Ugzlkng19zd1sCp0uBx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgxGo0j_N27_lBgy6v54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgxpoKwQD90Jmx6vbbB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},{"id":"ytc_UgwOc_WW4Kin5Q593uR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},{"id":"ytc_UgwUV1mjOgCfwZgz9aJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},{"id":"ytc_UgyucpGuRzlZW_0vX814AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgwtKSmvRswzwoQM_Ft4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},{"id":"ytc_UgwrjnoRZYlxyEGhaQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
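When inspecting a raw response like the one above, it can help to parse it and check each coded record against the expected schema before trusting the per-dimension values. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred from the labels that appear in this sample and the dimension table, not from any official codebook, so they are assumptions.

```python
import json

# Allowed labels per dimension, inferred from this sample only
# (an assumption -- the real codebook may define more values).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"resignation", "approval", "mixed", "fear",
                "indifference", "disapproval"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded record.

    Raises ValueError if a record is missing a dimension or uses a
    label outside the inferred allowed set.
    """
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# First record from the raw response above, used as a smoke test.
raw = ('[{"id":"ytc_Ugyf5xQ8whtO3HQl8nB4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
records = validate_records(raw)
```

Running the validator over the full array would surface any label the model invented outside the codebook before the records enter downstream analysis.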