Raw LLM Responses
Inspect the exact model output for any coded comment; entries are looked up by comment ID.
Random samples:
- “@Bleach-t6x As of 2023, AI illustrations emit ~310x - 2900x less CO2e per year …” (ytr_Ugz2eYuA5…)
- “You know what’d be funny? If some of these automation companies were just minimu…” (ytc_UgwTp54_4…)
- “If AI images were sustainable from an environmental standpoint and didn’t steal …” (ytc_Ugw9fM9xf…)
- “Canadian here. I’m not eligible for any of the covid benefit relief whatevers be…” (rdc_fn5kqp1)
- “When these magnificent robots get reprogrammed to destroy all life on …” (ytc_UgxIkfar0…)
- “Stressed with existing AI tools because they are too complicated? It's time for …” (ytc_UgyW09Teg…)
- “They just wont stop bro its like thes ppl wake up to be a step closer to a domin…” (ytc_UgyC6nOxc…)
- “we couldnt even make safe plastic, in fairness, we COULD have responsibly used p…” (ytc_Ugz8gw2h4…)
Comment
The token burn on garbage HTML is the part that really hurts. The scraping failure itself is one thing, but feeding that noise into an LLM that happily processes it token by token is where the financial damage happens.
Two things that would have saved you here (see the sketch after this list):
1. A circuit breaker on your agent loop. Set a max token budget per URL and a max retry count, and if the extracted text has more HTML tags than actual words, bail out instead of retrying. A simple heuristic like checking the ratio of angle brackets to alphanumeric characters catches roughly 90% of captcha/garbage pages before they ever hit the LLM.
2. Pre-filtering before the LLM. Tools like trafilatura or readability-lxml can extract clean text from HTML without any AI involvement. Run that first, check whether you got meaningful content, and only then send it to the LLM for structured extraction. It also cuts token costs by roughly 80% on pages that do render correctly.
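A minimal sketch of both safeguards, assuming a Python agent loop. trafilatura is a real library and `trafilatura.extract`/`fetch_url` are its real entry points, but the thresholds, budgets, and helper names here are illustrative assumptions, not anyone's production code:

```python
import trafilatura  # pip install trafilatura; readability-lxml works similarly

MAX_RETRIES_PER_URL = 2      # assumption: retry cap before the breaker trips
MAX_CHARS_PER_URL = 32_000   # assumption: crude proxy for a per-URL token budget
GARBAGE_RATIO = 0.05         # assumption: angle brackets per alphanumeric char


def looks_like_garbage(text: str) -> bool:
    """Circuit-breaker heuristic: bail out when markup dominates content.

    Captcha walls, JS shells, and error pages are mostly tags, so the ratio
    of angle brackets to alphanumeric characters spikes on them.
    """
    brackets = text.count("<") + text.count(">")
    alnum = sum(ch.isalnum() for ch in text)
    return alnum == 0 or brackets / alnum > GARBAGE_RATIO


def clean_extract(html: str) -> str | None:
    """Pre-filter: pull readable text out of HTML with no LLM involved."""
    text = trafilatura.extract(html)  # returns None if nothing useful is found
    if not text or looks_like_garbage(text):
        return None
    return text[:MAX_CHARS_PER_URL]


def fetch_for_llm(url: str) -> str | None:
    """Fetch a URL and return clean text, or None once the breaker trips."""
    for _ in range(MAX_RETRIES_PER_URL):
        html = trafilatura.fetch_url(url)  # None on network/captcha failure
        if html and (text := clean_extract(html)):
            return text  # only this ever reaches the LLM
    return None  # breaker tripped: skip the URL, burn zero tokens
```

The ratio threshold is the main knob (lower it to reject markup-heavy pages more aggressively), and the character cap is a crude stand-in for a real tokenizer-based budget.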
The build-it-yourself trap is absolutely real. I went down the same road and eventually landed on Firecrawl for most things. Not sponsored or anything; the pain of maintaining custom scrapers versus paying $20 a month for something that handles the edge cases just isn't a close call anymore.
Source: reddit · Category: Viral AI Reaction · Posted: 1777057823.0 (Unix epoch, ≈ 2026-04-24 UTC) · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_oi0ni22","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"rdc_oi0oy9w","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"rdc_oi1bcoz","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"rdc_oi2fibt","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"rdc_oi3ppej","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
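For reference, a hedged sketch of how a coded row like the table above can be recovered from that raw response: parse the JSON array and index it by comment ID. The function name is illustrative, not the coding tool's actual API:

```python
import json


def index_codings(raw_response: str) -> dict[str, dict[str, str]]:
    """Map comment ID -> coded dimensions from a raw LLM response array."""
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in json.loads(raw_response)
    }


# Against the response above, index_codings(raw)["rdc_oi0ni22"] returns
# {"responsibility": "developer", "reasoning": "consequentialist",
#  "policy": "none", "emotion": "mixed"} -- the row rendered in the table.
```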