Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Ok but how does AI tell the difference between slightly annoyed and slightly con…
ytr_UgzGdqLeA…
"We are cooked" is the defeatist framing capital wants you to internalize. Mass …
ytr_UgxWRqFWs…
Lol so nobody looked at the movie terminator. The scientists did the same thing.…
ytc_Ugwcb9JuB…
the takeaway is clear we now have to use AI to defend against AI. Its crazy that…
ytc_Ugx0Rx_PD…
Of course they should've rights.
I thought humans would be mean towards Ai and I…
ytc_UgzuKT3-U…
As a programmer i hate when people call neural networks(later - NN) "AI", they a…
ytc_UgweGnP56…
lmao wokies coping that ai will replace them in 10 years and their shitty devian…
ytc_UgzjrqjEL…
Universal income will be the only solution. And tax the AI companies higher. The…
ytc_Ugyv8dSWy…
Comment
This is not how software engineers use llms in their workflow. They use it as a tool to speed up their workflow, you tell the llm what exactly to do using technical computer science terminology and ask it for input here and there about implementation; no engineer in their right mind would let their entire code be dictated by ai. It has always been context engineering.
youtube
AI Jobs
2026-02-24T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxmld8P1setEUiV15B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_MT8fUWaWP73_BRd4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzE-Hh3JvScxvadqrR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzGWS7YRGW-MYmvZTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzEDNrQ4kIPrzam6UN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyzUJHbNSDFJlo8l-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzUyD1mD3rcZ8bqYEJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxD6JUdLcrm8QFBofV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzphHG2u01eZwccsph4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzkPD8Fiwo9SLQ3eox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
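The raw response is a JSON array with one coding record per comment, so the look-up-by-ID workflow amounts to parsing the array and indexing it by the `id` field. A minimal Python sketch, using two records copied from the response above (the variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# The two records below are copied verbatim from the response above.
raw_response = """
[
  {"id": "ytc_Ugxmld8P1setEUiV15B4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzphHG2u01eZwccsph4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

records = json.loads(raw_response)

# Index the records by comment ID so a single coding can be inspected.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugxmld8P1setEUiV15B4AaABAg"]
print(coding["emotion"])  # indifference
```

The same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) appear as keys on each record, so a coded comment's table view is just a rendering of one entry in this array.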