Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As we all know, the rise of Ai and autonomous systems (robots) are on a rise, eventually they will take over 99% of jobs. 80% or more of people in higher positions than us want to keep the rich, rich and the poor, poor. This doesn't sit right with me. They need someone like me, a humanitarian to orchestrate a plan that takes us from living in survival mode to living a life of abundance. This will require a "New World Order" not in a bad way, but in the best way possible. We have all the people and resources here on earth to put this plan in motion. We need companies like AstroForge to continue there work on mining smaller asteroids, eventually graduating to mining 16 Psyche. Producing Von Neumann Probes that can mine and replicate themselves to mine more asteroids without the need for humans. Storing precious metals on the south pole of the moon. Then once we have enough autonomous systems (robots) we can assign every family a duo bot system, providers and caretakers. The robots will build cities for us all, create fusion energy plants, give us all free energy and wifi, deconstruct larger cites, recycle and distribute goods and give 50% of the earth back to animals. By then, we will have a new currency that isn't about money, but about your contribution to your community. 3-4 hours a day, 4-5 days a week and you'll be able to have credits for exploration, luxury and more. People won't be in survival mode anymore. They would live a life of abundance.
Source: YouTube · AI Governance · 2026-03-17T22:5…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw48ieZigxmgE0Mupl4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_Ugzbwf8gexX0zwQqdml4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgzKrdPiIhIZcwB05qN4AaABAg", "responsibility": "user",        "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgyWfjnCbjyX-LaF9Ph4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgzmeiR1aeND6XuMKl94AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "none",     "emotion": "mixed"},
  {"id": "ytc_UgzNzaq5C4EApMRLVWd4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgyEPiTCJT2ffJJFrXR4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_Ugzj-9JTyekcZaoi4gB4AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgyOoCyLW4Y6y9yNPNJ4AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "none",     "emotion": "fear"},
  {"id": "ytc_UgxBjIRpwTwIj80yW114AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
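A raw LLM response like the one above has to be parsed and checked before the per-comment codings can be trusted. The sketch below is a minimal, hypothetical validator: the allowed label sets are assumptions inferred only from the values visible in this export (the actual codebook may define additional categories), and the `validate_codings` function name and the sample IDs are illustrative, not part of the real pipeline.

```python
import json

# Allowed labels per dimension. ASSUMPTION: inferred from the values seen
# in this export; the real codebook may include more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "government", "distributed"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"mixed", "fear", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coding records) and keep
    only records that carry an id and whose labels fall inside the allowed
    sets; malformed or out-of-vocabulary records are dropped."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Illustrative input: the first record is well-formed, the second uses an
# out-of-vocabulary "responsibility" label and is rejected.
raw = '''[
  {"id": "ytc_abc", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_def", "responsibility": "martians",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]'''

print(len(validate_codings(raw)))  # → 1
```

In practice a rejected record would be logged and re-queued for recoding rather than silently dropped, so every comment ends up with a usable coding.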