Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Another copy-paste bullshit video. Or probably I should say: another AI generated slope script. 2026 just started, and there was no significant progress. Claude released Opus 4.6 which, in many cases is worse than 4.5. OpenAI released Codex 5.3. And that's about it for now. And the story about "AI replacing developers" is getting really old. Each month I hear it's "just 12-18 more months until replacement". C'mon, ask that AI to generate you some new catch phrase, this one is really worn out. I'm working with multiple companies, small startups and large worldwide corporations, and EVERYONE is writing code by hand still. Not one software company I know found AI massivelly useful, not to mention replacing software engineers. And studies from MIT and other respected researchers backs that up: developers are NOT getting better or faster when using AI, they are significantly slower in fact. Spending their time to prompt, review and correct AI. I see this on myself - whenever I ask Claude or Codex to write something, it usually takes more time-to-result, than I would spend writing that part myself. And that's given I will not just throw away AI generated slop to trash. So please, spare us these bullshit revelations. Sure the potential is there, but it's virtual and not yet a reality. Maybe never will be.
Source: YouTube · "Viral AI Reaction" · 2026-03-09T14:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugz26Eja1TThroUpPEZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwAfCr1czjEe-0SOzd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyUhu2ccs3h6r1M-DF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugw8N_GmoxWhmFbv5pZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgySHraBUp1wga3oLuV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzXe26Gw5G6f7jwv-54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"}, {"id":"ytc_UgxVErfCQMJQHF-CoD54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugwlid1UaXmuywBfB_t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwzLlXz_pva_RcKa4N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxBYeWCCk_o7hFoSgN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"} ]