Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So my takeaways to this are:
1) wall off AI from biological weapons systems or any military weapons systems, period. AI should not be allowed to be involved with WMD's;
2) "No" scientist has ever come to the conclusion that the value of art and creativity in society is that it is generated by us (or elephants or dogs, lol). An AI system may be able to find unusual patterns and apply them to artistic endeavors, but the ineffable soul or humanness or whatever that we bring to art is irreplaceable no matter how technically good an AI generated song (for example) is;
3) Unless we build machines that can manufacture new devices completely autonomously, AI won't be able to manufacture anything. It can't mine materials, it can't physically deliver them, it can't 3D print them and put them together. So let's not do that for "them";
4) The idea that we have no value as people unless we are working is such a capitalist notion. If we had a viable/functioning Basic Universal Income, we'd travel more, we'd pursue creative endeavors, we'd connect as people more, we'd spend more time in nature, we'd continue to invent new things, cause it's kinda what we do. The whole planet wouldn't become a Wall-E yacht of fat lazy idiots (just some of the population would). Yes, it's important to have goals/motivation, but the idea that it ONLY happens if we work a job is a little bit of an Arbeit Macht Frei mentality;
5) It will be important to continue learning coding/computer science (or any science based studies, really), so that we know how to hack AI or rebuild society after AI tries to destroy us and we have to rise up against Skynet;
6) Archaeology will still be a valid career path in the world of AI, lol.
youtube Cross-Cultural 2025-10-21T17:0…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_UgxqCZOGv9DiME_jLx54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugzi8xG-fvXotj4eX_d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugyi5fdMUS9Ga8XZ5G14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgzvzREQwWa6uPaEPYt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgzBg63KFbqA1HKxgAh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwyuO246GWa8IKvx0h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugx5oQl9dIUR4aI4m3l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugy4MimEydhsVnJmf1p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgzdD0pml1cwBlr1-nV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgxDzUtG8wiHAgCq8qp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"})
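Note that the raw response above opens the array with `[` but terminates it with `)`, so it is not valid JSON as printed; a strict `json.loads` would raise and the coder would fall back to "unclear" for every dimension, which is one plausible reading of the result table above. A minimal defensive-parsing sketch (Python; the `parse_codes` helper name is hypothetical, but the field names `id`, `responsibility`, `reasoning`, `policy`, `emotion` are taken from the raw response):

```python
import json

def parse_codes(raw: str) -> list[dict]:
    """Parse the model's JSON array of per-comment codes.

    Repairs the mismatched terminator seen above ('[ ... )' instead
    of '[ ... ]') before parsing; returns [] on any other parse
    failure so downstream code can fall back to "unclear".
    """
    text = raw.strip()
    # Repair a mismatched closing bracket: '[ ... )' -> '[ ... ]'
    if text.startswith("[") and text.endswith(")"):
        text = text[:-1] + "]"
    try:
        parsed = json.loads(text)
    except json.JSONDecodeError:
        return []
    return parsed if isinstance(parsed, list) else []

# Example with the malformed terminator from the raw response above:
codes = parse_codes(
    '[{"id":"ytc_x","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate",'
    '"emotion":"outrage"})'
)
print(codes[0]["policy"])  # -> regulate
```

Whether a production coder should silently repair malformed output or surface it as "unclear" is a design choice; the sketch does the former only for this one known failure shape.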