Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
At my big tech company people put 10-15k+ LoC in single files. That way AI agents get stuck trying to navigate through the file and humans are still required. You can always double the file size to 30k LoC faster than Nvidia can double the amount VRAM in their GPUs.
reddit AI Jobs 1752724449.0 ♥ 18
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_n3kk6lp", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_n3ldlow", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "rdc_n3kr9mp", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_n3k8jsd", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "rdc_n3kczf7", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
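Responses like the one above are batch JSON arrays, one object per coded comment, keyed by a comment id. A minimal sketch of how such a response could be parsed and validated is shown below; the key names are taken from the response itself, while the function name and error handling are illustrative assumptions, not part of the actual coding pipeline.

```python
import json

# Abridged copy of the raw response above (two records shown).
raw = '''[
  {"id": "rdc_n3kk6lp", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_n3ldlow", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"}
]'''

# The four coding dimensions plus the comment id, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_text):
    """Parse a batch-coding response and index the records by comment id.

    Raises ValueError if any record is missing an expected key, so a
    malformed LLM response fails loudly instead of silently dropping codes.
    """
    records = json.loads(raw_text)
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
    return {rec["id"]: rec for rec in records}

codings = parse_codings(raw)
print(codings["rdc_n3kk6lp"]["emotion"])  # approval
```

Indexing by id makes it straightforward to look up the codes for the single comment displayed on this page (`rdc_n3kk6lp`) out of the batch response.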