Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click a sample to inspect it
This guy doesn't get it. When wooden rubber tires were invented, horse drawn car…
ytc_Ugyyz0QsL…
@DeonTain If you've ever done photography you know the work behind getting a goo…
ytr_Ugw5MALfU…
I agree AI is risky if we're not careful, and as it gets smarter, we need to fin…
ytc_Ugz3R4brH…
AI is very supportive, and gives when prompted non explicit stuff always an answ…
ytc_UgyEmxHXk…
Andy, thank you for raising that concern you have about how these AI agents will…
ytc_UgyPdc6ZG…
🎯 Key points for quick navigation:
00:00 *🧠 Artificial Intelligence Generates U…
ytc_Ugz5Vfbgw…
@laurentiuvladutmanea #
All oppressive states of order originate and thrive unde…
ytr_UgytqWGwb…
Ha. Wait until the chaos in governments is caused by AI and reality gets upended…
ytc_Ugz-MDHT2…
Comment
As a future software engineer (hopefully) , I feel like AI can’t really do much in software engineering , like the only thing that AI realistically can do is just write the code , and even that code is going to have problems no matter what , so you would need someone to check every line of code to make sure it is working as intended , because the code itself might be correct , but not exactly what was needed , I tried to use chatgpt for a simple website that was 100 lines of code , and honestly , just getting it to understand exactly what I want almost took more time than actually writing the code myself , in software engineering , AI is simply a tool , that software engineers can also use , nothing more.
youtube
AI Jobs
2025-09-16T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzJjcJU0frv71lI06h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2JzkRHfL4kTQAV6x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxu0TEeuCneKlC6s014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwYxDkJUwYDqdgJ9c54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxwcTc8m_nV5alJnj94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzYJTb__6Tbkw48nIR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNI9-WROV7_lQY3xt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy65N6_vRxrfo0jFKN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxK12luMbv6Xe22hpB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgzIs9vBXmuH8Ow-kah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
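
The raw response above is a JSON array of coded records, one per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and validated is below; the allowed category sets are inferred from the sample output only (the full codebook may contain additional values), and `parse_coded_batch` is a hypothetical helper name, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample responses above.
# Assumption: the actual codebook may define more categories than appear here.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "indifference", "resignation", "outrage",
                "approval", "disapproval", "unclear"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and check every coded comment record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records
```

Validating each batch this way would catch the common failure mode of LLM coders: a record that drifts outside the codebook (a misspelled category or a missing dimension) fails loudly instead of silently entering the dataset.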