Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Have you watched any of the videos on YouTube (other than TED)? The people who created the AI framework now realize they have opened Pandora's box. Scientists have given investors (both good and bad) the golden grail. That is to replace humans with aliens. From 40 to 50% of a typical company's operating statement is payroll. Corporations driven by Investors, or investors driving AI companies, are pulled by the gravity of lowering the percentage to 10% (while filling their pockets). Sure, they will tell us that Renaissance awaits us, that they are empowering American workers with Superpower (Palantir), and they even use the term the GOLDEN age. More capital (the blood of the alien) flows into data centers, new chips that process faster, and a bidding war to hire the next great scientist. If we define ASI as an alien that is not biologically a human, the alien will be knocking at your door before 2030 (a conservative view). Don't act surprised. But can we slow it down and create an ASI that is more human than alien? First off, if an alien appeared on our planet, we would convene a worldwide national security commission to determine whether the alien would harm us. If we took that step, it would slow it down. Next question: Can we agree to extend the building of ASI, which is more like a human than not? This can be done, but it may take two decades. One party that we must get to slow it down is the people we voted for.
youtube · AI Jobs · 2026-01-06T19:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytr_Ugw5ERhyZHqwyOYef6Z4AaABAg.ARdlBLlz31aARe9v5iBdPC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_Ugw4uTR1ugsNZ_iJAOp4AaABAg.ARdk0zsAfCjARdmP_IO7dQ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugw4uTR1ugsNZ_iJAOp4AaABAg.ARdk0zsAfCjARdy0NVNC4W","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytr_Ugw4uTR1ugsNZ_iJAOp4AaABAg.ARdk0zsAfCjARe8-GT0jH-","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytr_Ugw4uTR1ugsNZ_iJAOp4AaABAg.ARdk0zsAfCjAReTgY9Ovn9","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}, {"id":"ytr_UgwSF5uPDiCEAio4xop4AaABAg.APUHLms18SxAV3xx4EvPaZ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_UgwTxo_vixP0UBCgZxR4AaABAg.AUnYC9n7AbJAVAFgvjH1M7","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_Ugx3sLKS_rZVlCSdqYl4AaABAg.AEV0oyNtbZnAEXzefuV_TA","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugx3sLKS_rZVlCSdqYl4AaABAg.AEV0oyNtbZnAEYsng_uRXw","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"}, {"id":"ytr_Ugyep5BQKf8fRnu1t794AaABAg.A8-2KqOvJOLA85yaQ6dlWJ","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"} ]