Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (comment preview — comment ID):

- "i think if i would be taught by a robot i will pretty much be a robot :) , knowl…" — ytc_Uggf_xbgM…
- "So im guessing employers wanna go automated for pretty much 100% profit. OK so h…" — ytc_UgyrYcDAK…
- "There is another possibility to why they were so willing to backtrack. Who's to…" — ytc_UgxrR749v…
- ""Is AI Coming for Your Job?" It already did. The owner was already retiring, so…" — ytc_UgzgpzwS4…
- "The best part of it is. He didn't make it. He can copy right the A.I Software (I…" — ytc_Ugw9ekpPS…
- "What will the truck drivers get? They will get a kick in the ass. Joking aside d…" — ytc_UgzO14RWc…
- "Good man don’t sell it least of all to this asshole . AI in the wrong hands is d…" — ytc_Ugxjvd9tp…
- "Nope democratizing art simply means making it accessible to people who either ca…" — ytr_UgysakA09…
Comment
Just remember one thing: AI doesn't think, humans do. If you blame the AI, then you are delusional. If you can't read generated AI code and analyze it, then you're not a programmer; go back to the basics, and then do your job. People don't understand that it's their fault. When using AI, you become a debugger. If you understand code and are able to read it, then nothing should stop you. If it does, then you're doing something wrong, not the AI. its just easy for humans to blame AI rather than themselves. a 10 year senior experience programmer wrote a blog about how he wasted time on vibe coding, not because of AI, but because he admitted not thinking or reading the code. Another blog about how to be a professional senior programmer said, "Don't blame the code"
youtube · AI Jobs · 2026-02-04T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQ33uQHDuqbRTs0Xl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxnuDWs3LNKYUTYNa14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwpTKjUxAJMDr7J9a14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz47lmGgJIepfZYdat4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwdouWYlroI5dicLfp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzoPC_RylaHqtszkGx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxaTOo-G1YRVFcaRBZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzmpjmuNotCfcllS794AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTNMuHg-M0W6HklLp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwSFZW0OH1ioWEa4kV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
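A response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, not the tool's actual pipeline: the field names come from the JSON above, but the allowed value sets are only those observed in this batch — the full codebook may define more labels.

```python
import json

# Allowed values per dimension, inferred from the codings shown above
# (assumption: the real codebook may permit additional labels).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "approval", "indifference", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop entries missing the comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)  # every dimension carries a known label
    return valid

# Hypothetical one-row response for illustration:
raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
print(len(parse_codings(raw)))  # 1
```

Filtering rather than raising keeps one malformed row from discarding the rest of a batch; rejected IDs could instead be queued for re-coding.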