Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have Gemini Ultra - used for research. I asked this very same question last month Feb 2026, broken down year by year to 2030. According to Gemini, the USA will loose 100million jobs out of a population of 150M by 2030. The rich might get richer for a while, but eventually society will break down and collapse. For every dollar saved using Ai, there has to be a tax base, without it governments don't have enough to run. For every job lost, those people go on unemployment for a very small fraction of what they earned for six months, some states a year. After that, with no work, there is no way to survive. Think about supply and demand. These companies thinking they will save money but eliminates thousands of jobs - those people can't pay their mortgage, vehicles, food, or afford the luxuries that the income once provided. Yet Ai is touted as the human race's advancement, a dawn of a new era. It comes down to very basics: SUPPLY and DEMAND. When people can't afford essentials, the demand will shrink drastically. When supply warehouses sit stockpiled, the rich will just sit back waiting for orders to trickle in, their factories and warehouses running Ai idle operations. Edited to add this: From Anthropic (Dario), he stated what took a team several months to code; took his Ai software one hour. He predicts by the end of 2026 that software programmers will no longer be needed.
youtube · AI Governance · 2026-03-18T19:0…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   distributed
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugzk8WM6xxB5MNhuPBd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxSme7J1XVuYPGKSzt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyJeUpzaDi997Rb9YJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_Ugw8B_otFoJBkVh_COx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyKlovs6cD9-Z3lrW94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgxTtVzeeAQngRwjNTx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgyKDauE0224Q9u7JoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgzP4Hm3JzxAWekAk4x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugwrg1_dLENOjkdDJyp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}, {"id":"ytc_UgxtF8G42jIt_VdHaLN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"} ]