Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@TysonJensen Right, we wouldn’t need jobs. And no, we absolutely will build it. We’re already like 90% of the way there. Many correctly argue that AI will first create a dystopia, and then what follows could be a utopia. The “dystopia” is first created from the fact a select few companies will own AGI, they will license it to every company which will cut costs down to almost nothing, mass unemployment, etc. However this is inherently economically unfeasible as there would be no income for individuals to spend their money as they’ll simply have none. However instead what will almost certainly happen is a dystopia which is backed and propped up by UBI, which is almost worst because it would still provide funds for those companies. Imagine 90% of all wealth goes to these companies, they get taxed at 80%, say half of that 80% is put towards governments utilizing AI for weapons (Again, licensing it from those companies) and the rest gets recycled back into the economy via UBI, and the cycle continues. We are already in that cycle NOW, just to a far less extreme degree. but again, eventually it’ll become so cheap that everyone will have AGI. Meaning it won’t be just 4-5 companies that have it, it’ll be hundreds, and then thousands, and so on and so forth. We know this to be true based on China’s deepseek AI which is completely open source.
Source: YouTube · Video: AI Moral Status · Posted: 2025-08-06T01:3… · ♥ 1
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgxyZpoppVranDY7zdV4AaABAg.ALUj3kwgr1pALUm5AdXv6n", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugyo-r4nOuF8rtdyp4l4AaABAg.ALTOlKx9oEfALVjUjj42aw", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_Ugw37CTH40v442OwJMp4AaABAg.ALR3Sft42c_ALRoOSPgABJ", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugw37CTH40v442OwJMp4AaABAg.ALR3Sft42c_ALTCago21CH", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugw37CTH40v442OwJMp4AaABAg.ALR3Sft42c_ALTEZNNRZEJ", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugw37CTH40v442OwJMp4AaABAg.ALR3Sft42c_ALTIOHD1xQT", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwEinemBwSlRmsxiqV4AaABAg.ALR2uZRCSyDALUAiPgSnsi", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxJKtg5-l_40EKucoF4AaABAg.ALIEPaDfgvPALUFgJzPDRx", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugw8p1yrNpKf61o7F-p4AaABAg.ALD7_eoovHAALD8YFlW2Y_", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugw8p1yrNpKf61o7F-p4AaABAg.ALD7_eoovHAANVi0sxlkUW", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
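To spot-check that the coding result shown above really came from this raw response, one can parse the model's JSON array and look up the entry for the comment's id. This is a minimal sketch, assuming Python; `lookup_coding` is a hypothetical helper, not part of any tool, and the `raw_response` string is a two-entry subset of the array shown above.

```python
import json

# Subset of the raw LLM response array shown above (verbatim entries).
raw_response = """
[
  {"id": "ytr_Ugw37CTH40v442OwJMp4AaABAg.ALR3Sft42c_ALTCago21CH",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugw8p1yrNpKf61o7F-p4AaABAg.ALD7_eoovHAALD8YFlW2Y_",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "resignation"}
]
"""

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse the model's JSON array and return the coded dimensions for one comment id."""
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings[comment_id]

# The id of the comment displayed above; its coding should match the table.
coding = lookup_coding(raw_response, "ytr_Ugw37CTH40v442OwJMp4AaABAg.ALR3Sft42c_ALTCago21CH")
print(coding["policy"], coding["emotion"])  # liability fear
```

Here the looked-up entry reproduces the Coding Result table exactly (responsibility = company, reasoning = consequentialist, policy = liability, emotion = fear), confirming the displayed coding traces back to this raw response.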