Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I've spent the last week going through documentation for libpcap, writing a wrapper for it in Swift. The number of times I asked for a simple explanation of something by Gemini (google's ChatGPT) and got something flagrantly wrong, is depressing.
It flatly told me that there was a library on GitHub that didn't exist. It told me a library that did exist had functions it did not, then tried to force me to use those made-up functions, etc.
Eventually, I got fed up and just had it generate bad C code so I could translate what I needed to Swift. I was better off just reading the documentation and googling it myself.
If that thing has taken anyone's job, then they weren't meant to be a SWE.
youtube · AI Jobs · 2024-06-16T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxOJ1Bqz9f_j-iTud54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyTnW4vquOi4snLNJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgydFmTm9V4-J4meRKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugzuiq22Gwud5Hslo8l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxujXQpBCPMLbkP-e14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwNyMXuVvt-4SFLT6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgykdC9nIDetM4BNT4x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxwSPgZsOGpesmz0gB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw-URfr-m_vFqqAHS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwbqBOXYCewz6K1Lc94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"}]
```
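Before a batch of coded records like the one above is accepted, it is worth checking that the model actually stayed inside the codebook. The sketch below is a minimal validator, assuming the raw response is a JSON array of flat objects as shown; the allowed value sets are inferred only from the responses visible on this page and the real codebook may include values not seen here.

```python
import json

# Allowed values per coding dimension -- inferred from the sample responses
# above; the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def validate_coded_comments(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every record has an id and all four dimensions
    fall within the allowed value sets.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"response is not valid JSON: {e}"]

    errors = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            errors.append(f"record {i}: missing 'id'")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append(
                    f"record {i} ({rec.get('id', '?')}): "
                    f"unexpected {dim}={value!r}"
                )
    return errors

# Example: one well-formed record validates cleanly.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(validate_coded_comments(raw))  # []
```

Records that fail validation (an unknown emotion label, a missing id) can then be flagged for re-coding rather than silently written into the results table.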