Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Artist would let others learn from their art style but they would let an AI lear…" (ytc_UgzVsBznd…)
- "He's rasing the alarm because people in leadership roles will be very interested…" (ytc_UgxBCYtSe…)
- "He is just spouting nonsense. He has no clue. Every new hype cycle needs a blue …" (ytc_UgzFoW2Yu…)
- "I like AI for a lot of reasons, but I strongly dislike the way people want to ab…" (ytc_Ugyp728_n…)
- "FAKE! Nothing like rotoscoping out whomever the fighter was that actually knocke…" (ytc_Ugy4rG6gu…)
- "What makes AI seem so bad is that all the rich people/ entities working on it ar…" (ytc_UgxOHH7vT…)
- "Let's be honest - with Microsoft's track record on black facial recognition, BLM…" (rdc_fupz4id)
- "Learn AI, so AI companies will be forced to upgrade their systems to handle this…" (ytc_Ugz6PHbsU…)
Comment
I use AI to learn concepts and generate low-level code, not to vibe code. The magical part about using AI is that, as long as we have strong concepts in our heads, we can learn MOST programming languages fairly quickly. It's a great tool to speed up the coding process for developers who already know what they're doing. With concepts under our belt, we mainly use AI to generate what we want as concisely as possible. We piece this concise code together to make our stuff.
And maintenance is also an issue. Let's say I want to add new things to the app in the future. How does AI know the optimal way to make the code more extensible? You still have to make the decision yourself.
But one thing's for certain: with AI, we no longer need that many low-level developers. Anyone who says AI will replace us is likely someone who has never done software engineering in their life.
youtube
AI Jobs
2025-04-21T02:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzNGT6uTQlO6wmwnix4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwFP7-BnoqJLk-q5vt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_5oKPQzbVYYNFdZN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyg1rAFDArFtNQrMal4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7rqJ9gC8XPBv8rYZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwlF2h72Q2Uy0LvHz14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxwrMIEWroHG04uwgV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZPWfLnRwb_VZj0994AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyFbwUn6W9AvkKXcgd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyPN0IMI54avS_kzad4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
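Because the raw LLM response is a plain JSON array of coding records, looking up any coded comment by its ID is a one-liner once the array is indexed. Below is a minimal sketch using two of the records shown above; the variable names (`raw_response`, `by_id`) are illustrative, not part of the tool itself.

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are copied from the response shown above.
raw_response = """[
  {"id": "ytc_UgzNGT6uTQlO6wmwnix4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy7rqJ9gC8XPBv8rYZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]"""

records = json.loads(raw_response)

# Index by comment ID so any coded comment can be fetched directly.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugy7rqJ9gC8XPBv8rYZ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company mixed
```

The same index also makes it easy to cross-check a displayed coding table (like the one above) against the exact model output it was parsed from.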