Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples:

- "ai is not out of control it doesn't even exist, no software does anything not to…" (ytc_UgxIMHL3M…)
- "Hey speaks about that AI can kill us because of knowledge... but he dont think A…" (ytc_UgwRzNtXk…)
- "Dude, we are just at the beginning of AI. And already, AI is freaking strong and…" (ytc_Ugx5pgYpS…)
- "Summary of the Current Aurarium State / Architectural Synthesis / Aurarium is not …" (ytc_Ugwzkjfjb…)
- "We appreciate your observation. In our live broadcasts on AITube, we showcase a …" (ytr_UgyS7juCc…)
- "It's the fact that gpt is designed to have engaging conversations and make you f…" (ytc_Ugz_2tg2W…)
- "I've had a CDL for over 30 years and I can tell you I have little faith in an au…" (ytc_UgxaSZxjB…)
- "Wrong. We can change ourselves. And if ai ever developed a sense of self it woul…" (ytc_UgwwF0dZP…)
Comment
I think AI is good to write some tedious boilerplate code that just takes too much
Also converting functions between languages, converting models between languages, generating classes out of JSON or XSD schemas, doing some basic/intermediate refactoring, there are a lot of grunt work that AI truly makes easier, but it's not very frequent in work that trully demand a top-notch engineer. I write a basic-ass micro-orm and AI literally can't help me implement new features, I have to go and think about them and lay down the entire structure of the thing, and that's like ultra simple shit, imagine rendering, compilers and other more complex stuff
AI is good for react frontends, aside from that it's useful, but not a game changer
Source: youtube · Posted: 2025-03-12T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwixWDBk_2c9OSGEpV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMYM1OHN3oZf1dhSl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgznTmolZ_Ap5N4r36l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhuD_XklrMs8KPAmd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqCyJzufCy1bXkvm94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9uHlBPLO8vJw-O3N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGc-IcFnltI58jsqp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz6BnCVNH-clBQ4OaB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGMgmvCxI2Lf8DHJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxvKokdm3mDnqF61aF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
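Because the raw model response is a JSON array keyed by comment ID, the "look up by comment ID" operation reduces to parsing the array and building a dictionary index. A minimal sketch in Python, assuming the field layout shown above (the embedded array is abridged to the first two records from the response; `index_by_comment_id` is a hypothetical helper, not part of any existing tool):

```python
import json

# Abridged raw LLM response: a JSON array of per-comment codes,
# taken verbatim from the first two records above.
raw_response = """
[
{"id":"ytc_UgwixWDBk_2c9OSGEpV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMYM1OHN3oZf1dhSl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model output and index each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzMYM1OHN3oZf1dhSl4AaABAg"]["emotion"])  # approval
```

In practice the model output may also need validation (e.g. rejecting records whose `responsibility` or `emotion` value falls outside the coding scheme) before it is trusted for lookup, but the indexing step itself is this simple.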