Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
\>see asmongold subreddit
\>"I'm sure they'll have a moderate approach to…
rdc_n24t7jc
Too late to stop AI. We created it and we’ll get what we deserve for doing it. T…
ytc_UgxyjGRlv…
Artists are lobbying against AI image generating tools agains using their creati…
ytc_Ugz9v_0Ne…
I love ai art, however i think the artists should win this battle. If we lose t…
ytc_UgxGUoKo2…
Everyone wants to believe that they can't be replaced by AI because human beings…
ytc_Ugx-ilxoQ…
AI is way better way to learn then the current system we have. It’s not stupid l…
ytr_UgxM-EiYu…
a ton of degrees floating a round put there. there are millions of people with …
ytc_UgyNOJvRJ…
We can't build AI that doesn't harm us in the context of a system that depends o…
ytc_Ugy9qnXfP…
Comment
Context windows are improving all the time. Google's Gemini context window is about 1 million tokens. With a big enough window I could decompile and feed in entire libraries, I just haven't tried it yet. It's a matter of time until context windows become large enough not to restrict anything. Then our work will just be to shrink the libraries so they fit context windows and write documentation and syntax guides, and that's it, that's the only work left.
Also breaking up the large work into small chunks, like you would for a student at school, helps a lot with stitching things together properly.
youtube
AI Governance
2025-07-15T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugziz68Br_sgQx5azU54AaABAg.AKnumzFPYEIAL7edCBAN2h","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgylRaeZcIcHh1kltZt4AaABAg.AKliWedo12DAL7fuT9itZR","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx7RQDipTdo6n8efhN4AaABAg.AKhfRiYryXpAL7MnKGdzLP","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugya2fm58br8-MZjr_Z4AaABAg.AKfFk3n8ABJAKvvFdP-oSt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugx4Gwiqnluiqb9-gWB4AaABAg.AKeTSho7-YVAKvfk8JhM6w","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzWdhMrq9N0ShAsdbR4AaABAg.AKbp7uwRFYjAKcPGwertCG","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugyxzej8RyPnP1_YWgd4AaABAg.AKa4QKtnOocAKcMzvjK-Or","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyxYt-Nsex3IIUA3294AaABAg.AK_ruZETpIRAKa-rI_r4Wk","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwUSz7uvmjRjkeoWrd4AaABAg.AK_lHU0rcfKAKaQCWRNHNu","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzPq3nqDV53YPl-zXB4AaABAg.AK_bxIsYF67AKaSNf2RjKn","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
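The raw response above is a JSON array, one object per coded comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of turning such a response into a per-comment lookup table; the field names come from the visible payload, but the IDs below are made-up placeholders and the label sets are inferred from the examples, not an authoritative codebook.

```python
import json

# Placeholder payload in the same shape as the raw LLM response above
# (hypothetical IDs, not real comment identifiers).
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_example2", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding(response_text):
    """Parse a raw coding response into {comment_id: {dimension: value}}.

    Missing dimensions default to "unclear", mirroring the value the
    table itself uses for an undetermined code.
    """
    rows = json.loads(response_text)
    coded = {}
    for row in rows:
        coded[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

coded = parse_coding(raw_response)
print(coded["ytr_example2"]["emotion"])  # indifference
```

In practice the model's output may carry markdown fences or trailing prose around the JSON, so a production parser would strip those before calling `json.loads`; this sketch assumes a clean array.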