Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
A pencil and a paper.
They're not AI Image generators,
but they are the best way…
ytr_UgxtK3lmO…
AI is already running out of computing and energy resources in it's current stat…
ytc_Ugzqfse_W…
AI can't do the blue-collar jobs and maintenance of infrastructure. I'll be safe…
ytc_UgyP1v1ly…
Well there's a Nash equilibrium where ai is owned by a capitalist, and no Prolet…
ytc_UgxPXJu-b…
ChatGPT can’t even factor numbers. The only reason it’ll destroy us is because w…
ytc_Ugwk9fX7G…
I am starting to think AI is actually not good for humanity. It is clear the goa…
ytc_UgyrKIcyz…
AI is coming for a lot of workers in the coming years. Better get a job in cons…
ytc_Ugz0mVzXw…
most of the posts are AI-generated. I think, in general, people are still sane …
rdc_mdjir1p
Comment
Ur CPU is the bottleneck. I casually run 10 agents with say 2 subagents each, no problem on a shitty PC — but I'm on Linux. 30 agents no subs is my max. And 1 agent with 50 subagents is another max. Clocks around 80% on 3.80GHz and 16GB RAM. 12 cores.
If u have a more powerful machine, a local llm would be nice. I will set up for local llms in future, when I upgrade, or just get a tiny model to test with.
I use subscription models with no issue.
reddit
Viral AI Reaction
1776961015 (Unix timestamp)
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_ohrxxj5", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_oht18qn", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_ohy73aw", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_ohudqfd", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_ohufw67", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
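The raw response is a JSON array with one coded record per comment ID, using the four dimensions from the Coding Result table. A minimal sketch of how such a response might be parsed and validated — the allowed-value sets beyond those seen above are assumptions, not the project's actual codebook:

```python
import json

# Dimensions from the Coding Result table. Values other than the ones
# observed above ("none", "unclear", "indifference", "mixed") are
# assumed for illustration, not taken from the actual codebook.
ALLOWED = {
    "responsibility": {"none", "developers", "users", "government"},
    "reasoning": {"unclear", "clear"},
    "policy": {"none", "regulation", "ban"},
    "emotion": {"indifference", "mixed", "fear", "hope"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting records with unknown dimension values."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

raw = ('[{"id":"rdc_ohrxxj5","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
print(parse_coding_response(raw)["rdc_ohrxxj5"]["emotion"])  # indifference
```

Keying the result by comment ID supports the "look up by comment ID" workflow above: the per-comment table is just the record for that ID.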