Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This has nothing to do with defeating China and everything to do with them having exhausted all their training data and the resulting asymptotic progress of LLMs, which is about to create a plateau. Companies like Meta and Google will hit the same problem eventually, but they have massive quantities of UGC and/or scraped web data to train on. That’ll buy them more runway, but they’re all heading to the same place.
reddit Cross-Cultural 1736795138.0 ♥ 32
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_m6ylvwq", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "rdc_m6yldjg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_m70zght", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_m6yspls", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_m704kr5", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]
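
The raw response is a JSON array with one object per comment, keyed by the comment's `id`. A minimal sketch of how the per-comment coding shown above could be extracted from such a response (the ids and values are taken verbatim from the response; the lookup helper itself is an assumption, not part of the tool):

```python
import json

# Raw LLM response as displayed above: a JSON array of per-comment codes.
raw = (
    '[ {"id":"rdc_m6ylvwq","responsibility":"government","reasoning":"deontological",'
    '"policy":"regulate","emotion":"approval"},'
    ' {"id":"rdc_m70zght","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"indifference"} ]'
)

# Index entries by comment id so a single comment's coding can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# The coding result for this comment corresponds to the entry with its id.
result = codes["rdc_m70zght"]
print(result["responsibility"], result["reasoning"], result["policy"], result["emotion"])
```

This matches the Coding Result table above: the entry `rdc_m70zght` carries `none` / `unclear` / `none` / `indifference` across the four dimensions.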