Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugwzp58DT… — "There are a lot of possibilities with current technology, but Tesla and any othe…"
- ytc_UgzqXdXgw… — "👽 welcome to the Undisclosed Forces Option Alien Megastructure CMD Human AI In…"
- ytc_Ugzhct9Ox… — "First of all stop treating AI as it's a living thinking intelligence. AI can onl…"
- ytc_UgwYhiUkP… — "The issue that I see is that developers of AI assume (wrongly) that governments …"
- ytc_UgwX5-HdE… — "You are deeply mistaken if you really think that this is just an ordinary displa…"
- ytc_UgwDUA4c_… — "Firstly, let me just ask, do you believe the world is a fair place? It looks to …"
- ytc_UgxITLe0R… — "Wont the world turn communist the moment Ai replaces most of the jobs? Like, nob…"
- ytc_Ugxq_tLRw… — "Old school Datacenter rack consumes 15-25kW, Current Nvidia AI rack consumes 100…"
Comment
I’m an engineer building my own AI agents on Kubernetes, and here is my take: AI is more muscle than brain.
If a task is repetitive, computable, and follows an algorithm, it’s ripe for automation. But let's be real—these models are still "stochastic parrots." I treat them like kids: I’d never leave them unsupervised. Automation is the engine, but human oversight is the steering wheel.
I would start to worry if AI could create other AIs on its own, ones better than itself; that is not the case today.
Source: youtube · Topic: AI Jobs · Posted: 2026-02-25T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwKfb8be6Br5iu0Tap4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxRUnioLoRQViXENAV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwiMYnjCH_XcxpZukR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgycO39f-5FsIGv_E354AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsVLPUiY-AYltHC5l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_lgC1RtLglqFGIPN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEs8gw-f1gxxo38yt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy19OxIMZ6ySL1v7q14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzDb3goqByhP1Ionk94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw_fGUYt0_ivp6B2Ed4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
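When consuming raw responses like the one above, it helps to validate each record before it enters the coded dataset. The sketch below parses such a response and filters out records whose values fall outside the dimension vocabularies. Note an assumption: the allowed-value sets are inferred only from the sample output shown here, not from an exhaustive codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption, not an official codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"industry_self", "regulate", "liability", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that
    have an id and valid values on every coded dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record passes, one with an unknown value is dropped.
raw = (
    '[{"id":"ytc_UgwKfb8be6Br5iu0Tap4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},'
    '{"id":"ytc_bad","responsibility":"government",'
    '"reasoning":"unclear","policy":"none","emotion":"fear"}]'
)
print(len(parse_codings(raw)))  # → 1
```

Rejecting rather than coercing out-of-vocabulary values keeps coding errors visible instead of silently mislabeling comments.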