Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've spent the last week going through documentation for libpcap, writing a wrapper for it in Swift. The number of times I asked for a simple explanation of something by Gemini (google's ChatGPT) and got something flagrantly wrong, is depressing. It flatly told me that there was a library on Github that didn't exist. It told me a library that did had functions it did not. It tried to force me to use those made up functions. etc. Eventually, I got fed up and just had it generate bad C code so I could translate what I needed to Swift. I was better off just reading the documentation and googling it myself. If that thing has taken anyone's job, then they weren't meant to be a SWE.
youtube AI Jobs 2024-06-16T18:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxOJ1Bqz9f_j-iTud54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyTnW4vquOi4snLNJF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgydFmTm9V4-J4meRKV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzuiq22Gwud5Hslo8l4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxujXQpBCPMLbkP-e14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwNyMXuVvt-4SFLT6F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgykdC9nIDetM4BNT4x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxwSPgZsOGpesmz0gB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw-URfr-m_vFqqAHS54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwbqBOXYCewz6K1Lc94AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"}
]
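A minimal sketch of how a raw response like the one above can be indexed by comment id so an individual comment's codes can be looked up. It assumes only that the response parses as a JSON array of objects each carrying an "id" field, as shown; the helper name `codes_by_id` is illustrative, not part of the pipeline.

```python
import json

# Abbreviated sample in the same shape as the raw LLM response above.
raw_response = (
    '[{"id":"ytc_UgxOJ1Bqz9f_j-iTud54AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)

def codes_by_id(raw: str) -> dict:
    """Parse the JSON array and index each coded row by its comment id."""
    return {row["id"]: row for row in json.loads(raw)}

codes = codes_by_id(raw_response)
print(codes["ytc_UgxOJ1Bqz9f_j-iTud54AaABAg"]["emotion"])  # indifference
```

In practice the model output may also need trimming of markdown fences or stray text before `json.loads` succeeds; that cleanup step is omitted here.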