Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Robot dog is shitty, but body cams, DNA analysis, etc have been great tools for …" (rdc_jg1c8ks)
- "What we thought:- AI would be conscious, intelligent and won't be able to lie, b…" (ytc_UgzF-pcvo…)
- "In a vacuum, I have no problem with Ai art, but technology isn't developed in a …" (ytc_UgwBijU0p…)
- "This is sickening! This is the very end of greed! People who are thirsty for wea…" (ytc_UgxgzaZM4…)
- "A good video about the A.I Revolution we are now living and how this is making m…" (ytc_Ugy590wBS…)
- "I'm a fluent Japanese speaker who's lived in Japan for decades...and I can say w…" (ytc_UgxrIf5um…)
- "There is two possibilities : - Either its just a projection from him, and there…" (ytc_Ugx9Yp3pB…)
- "I dont want to ever see a robot like this in my Life! We dont need them. Why do …" (ytc_UgyDkR7sv…)
Comment
Try to do real deep debugging sessions with AI and you will understand the problem. Most LLM will always find something random what it considers as a problem, but it will rarely find the real bug. Additionally LLMs usually can not see underlying complexity. They may can find the problem in your code, but that you can find it anyway if you are profiling and interpret error messages. I see LLMs rather a turbo charge for engineers and not a replacement. Company greed for free code is unlimited however and the last 2 years ignoring juniors will cause a future problem for these companies and eventually they will pay the extra.
Source: youtube · Video: AI Jobs · Posted: 2026-02-05T10:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgynWYnHlV04MZk3RAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzOlDjskdM-perTZ-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2wRu3T9sE4U8-J1J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx7I5DpYoNLz-ZqmT54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwP8Gr4v-24_yfqShd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwbCd5mlBMrvDPwdvV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwB_U1eU7G-ttvpZp14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwiBji6d3FxlpIy8K54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw5XgBxd0HG-AuCXu14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKzQ1xa5tquKJc6sB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
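A raw batch response like the one above can be turned into per-comment codes by parsing the JSON array and checking each field against the coding vocabulary. The sketch below is a minimal illustration, not the tool's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown here, but the allowed-value sets only cover the codes visible on this page and may be incomplete.

```python
import json

# Coding vocabulary per dimension. Only the values visible in this page's
# output are listed; the full code books are an assumption.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"mixed", "resignation", "indifference", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response and index codes by comment ID.

    Raises ValueError on malformed rows or out-of-vocabulary codes.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Two rows excerpted from the response above.
raw = """[
{"id":"ytc_UgynWYnHlV04MZk3RAN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzKzQ1xa5tquKJc6sB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}
]"""

codes = parse_batch(raw)
print(codes["ytc_UgynWYnHlV04MZk3RAN4AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: each coded record is keyed by the same `ytc_…`/`rdc_…` identifier shown next to the samples.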