Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I am a senior software engineer from Germany, and I am honestly ashamed that even here in Germany most software developers have not understood that AIs can do CODING but not ENGINEERING. I never cared to explain this to my colleagues - after all, I quit my job as a computer scientist at the university almost 10 years ago because I was fed up with the dropping quality of the German education and academia system, so I just wanted to apply my knowledge and not deal with the incompetence of students AND supposed "scientists" (who could not implement hello world in Windows Batch). Every software developer knows that corporations totally fail at specifications. Therefore the last person in the chain - i.e. the developer - has to make design decisions every day. Which is normal and expected to some degree. However, often the developer even has to make design decisions for the functional specification, which he can only do because he is human and understands the context of what his corporation is doing and how the customers think and behave. An AI does not know that. If you want your LLM to behave like a real human employee, you'd have to onboard the LLM the same way and feed it the same information - just to realize it still does not behave like a normal human. This is the truth - just as normal (imperative) algorithms are only good at executing known tasks, generative AI is only good at handling known scenarios. But guess what - scenario specification is one of the things corporations totally fail at when specifying their products.
youtube AI Jobs 2025-12-18T10:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxQB_Ip28XAjdsU5GR4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgySfnmtkeMqR_tsgsl4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx6KluDwvHKiZx63BZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz4mtWmdrjPhmfLRad4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzJG4k1wp2ZiVyq_vJ4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgxBvsmRnbUFl-76H2F4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxetYSato0M21TT_T94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzz5SwEVeI-kpmc9oh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxu18P9Qjf2hZ0Eoh14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwMm-Jx_vpMGNlouI14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
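A raw batch response like the one above can be parsed and sanity-checked before its codings are stored. The sketch below is a minimal example, not the pipeline's actual code; the allowed label sets are inferred only from the labels visible in this response and are assumptions about the full coding scheme.

```python
import json

# Allowed labels per coding dimension. These sets are ASSUMPTIONS,
# reconstructed from the labels that appear in the response above.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each record is fully coded."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in the response follow a "ytc_" prefix convention.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# First record from the response above, as a one-element batch.
raw = ('[{"id":"ytc_UgxQB_Ip28XAjdsU5GR4AaABAg","responsibility":"developer",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # 1
```

Validating up front means a malformed or off-schema model output fails loudly at ingest time instead of silently corrupting the coded dataset.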