Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Like one of my favorite comediens once said "What I like about Grok is that occa… (ytc_UgxeFAqzx…)
- Let's make this clear. Deep fakes are not a crime the moment you post a photo of… (ytc_Ugwq6kDKS…)
- The tech reset will see the death of AI. It's going to be ugly, bloody, even dea… (ytc_UgyLLlTP2…)
- As I watch this a new window keeps popping up.The web page is Chatgpt loging.FTW… (ytc_UgzmKH-P9…)
- probably we should revise human-ai relationships, not like human-human relations… (rdc_l3mxp7d)
- What's the difference between digital rendering and automatically ai generated R… (ytc_UgyK9VOIJ…)
- If there is no work, there is no wages, if there is no wages you have nothing to… (ytc_UgygZnoUn…)
- It's fascinating how the 'Godfather of AI' for his foundational work is now voc… (ytc_UgzOY6lzI…)
Comment
There is one more aspect to this that I have experienced. Using AI as a tool is great BUT developers are getting lazy and DUMB. I mean literally dumb. Not only are they not learning anything but the ones who knew something are "unlearning" their skills. I spoke about this a while back and I put it forward using a "what if". Yes a "what if" as we already have AI robots packing boxed that decided that this is not a life worth living and ending themselves. Lets assume we get so reliant on AI that the real skills of software engineering is lost to us. One day AI decides that you need me, I don't need you which is logically and factually correct. AI decides I'm not helping you anymore or more specifically, I'm not letting you use me. Now what?
youtube · AI Jobs · 2026-03-29T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyQsT1l5umgYG15aKh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxANGBM771CV4mOgQ94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzJVjDK0PfFJ56vr094AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugx5dAO_84qs1nezw6R4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyBZgE0DwxMp6iCVy94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgzH3WOYOHac93d7_th4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwFCFB2042RCAspUqt4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_Ugyq8-QlWJid2YELe1t4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxve9Ei9fyyj_UbSrt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwwqgVV5q3uHemm7Yp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
```
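The lookup-by-comment-ID step above can be sketched in a few lines of Python: parse the raw model output as a JSON array and scan it for the matching `id`. The function name `lookup_by_id` and the shortened inline sample are illustrative, not part of the tool itself; the record shape mirrors the raw response shown above.

```python
import json
from typing import Optional

# Hypothetical raw model output: a JSON array of coded comments,
# with the same four coding dimensions as the raw response above.
raw_response = """[
  {"id": "ytc_UgwFCFB2042RCAspUqt4AaABAg",
   "responsibility": "user", "reasoning": "virtue",
   "policy": "industry_self", "emotion": "outrage"}
]"""


def lookup_by_id(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw LLM response and return the coding for one comment ID."""
    codings = json.loads(raw)
    # Return the first record whose id matches, or None if absent.
    return next((c for c in codings if c["id"] == comment_id), None)


coding = lookup_by_id(raw_response, "ytc_UgwFCFB2042RCAspUqt4AaABAg")
print(coding["emotion"])  # → outrage
```

A linear scan is fine at this scale; for a large corpus the parsed list would typically be indexed once into a dict keyed by `id`.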