Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Although I haven't Chatted extensively with various AI Models, I absolutely agre… (ytc_UgyKRraNS…)
- Okay, but ChatGPT told Alex that it had fun talking to him. Having fun involves … (ytc_UgwQ3KqP5…)
- Let’s just pay the 10% more then the 90% aren’t needed and can be sent to be rec… (ytc_Ugzb684aI…)
- Probabilistic reasoning is what AI uses. Humans are using symbolic reasoning. It… (ytc_UgwGRzmQQ…)
- I try to stick to two follow-up prompts MAX. At which point I draw the conclusio… (rdc_n4eaxw7)
- Everyone saying we’re doomed but as long as AI don’t become sentient and don’t g… (ytc_Ugwrkf4d_…)
- lmao I'm smarter than a bunch of ai devs imagine thinking humans would go extinc… (ytc_UgyL4i8Ve…)
- AI can't create anything new, it literally takes art from artists and slaps it t… (ytc_UgxCkXpWa…)
Comment
I agree. As a professional software architect, I find it hard to foresee AI making creators obsolete. There’s a significant difference between writing a few HTML pages and thinking you’ve built a website, versus developing complex enterprise applications. To me, AI is simply a highly useful tool—not a replacement for skilled developers.
My concern with the idea of AI replacing real developers lies in the misconception held by some managers. Those who pay for ChatGPT subscriptions often believe it’s feasible to replace developers by simply typing prompts into a chat. It’s both amusing and concerning to see such a belief, as it’s far from reality. Unfortunately, I’ve had the unpleasant experience of dealing with this kind of thinking firsthand...
youtube · AI Jobs · 2025-03-15T15:0… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyCs2oEyMhQjwG7iKV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwZjGw_xDs75NGbiXB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwW3HGEsDa5hGbEr654AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwwdmLLi_feB40DuCN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxT-38Kp8k6ih77RZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzm9k0NINhAZ9cwcSh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkVrrUM0XeHb76y5R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxlrOVK4ziJ7Qyp2kl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweZ1srBDUtmxjtu7R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-Ft7vNg_Im-dPJ1B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
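Because the raw response is free-form model output, a downstream step has to parse it and sanity-check each record before the codes are stored. Below is a minimal Python sketch of such a check; the allowed value sets are inferred only from the sample above, so the real codebook may permit additional values.

```python
import json

# Allowed codes per dimension, inferred from the sample response above
# (assumption: the actual codebook may define more values than these).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "virtue", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"approval", "mixed", "indifference", "outrage", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must carry a string comment ID.
        if not isinstance(rec.get("id"), str):
            continue
        # Every dimension must be present with a recognized code.
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"},'
       '{"id":"ytc_y","responsibility":"bogus","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
print(len(validate_batch(raw)))  # the record with the unknown code is dropped
```

A record that fails validation (unknown code, missing dimension, missing ID) is silently dropped here; a production pipeline would more likely log it and queue the comment for re-coding.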