Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I really liked your video but I see a different scenario. In the United States we have 1.5 guns for every 1 person. Now my goal is not to promote the 2nd amendment, but let’s do a little thought experiment. As you point out, unemployment could reach historic levels. Wealth concentration will be in technology sectors and all of the usual suspects that you have pointed to. Let’s add that we are currently living during a time that the older generations (50-100ish) own almost EVERYTHING. Young people can’t buy a house and are stuck with student debt for a degree that can’t get them a job. As you point out UBI will become a thing. I also see a scenario where most governments balk at the idea of giving their citizens a form of UBI. Ok let’s get back to the U.S. and the 1.5 guns for every 1 person. Has ANYONE ever looked at or studied the French Revolution? I wouldn’t want to be a “tech-bro” in the future. Obviously they would have unbelievable security and would probably feel they are invincible. BUT, the most DANGEROUS thing is somebody that has NOTHING to lose. Just saying……. Keep going with the whole scenario of this video. Try it and see what happens. To be clear; this isn’t what I want, but human nature is what it is. This will be bad for EVERYONE but the AI companies might not want to poke the bear. Or the AI’s and the robots they control will find a way to kill us all before we can unite and stand our ground. That movie Terminator is starting to look like more of a documentary; than a piece of science fiction….
Source: YouTube · Viral AI Reaction · 2025-11-23T18:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyQtIPDqf6EzUwWQgd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyhaehjGCL9-KkvKT94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzCFbxMhMA72zdeHMN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "unclear"},
  {"id": "ytc_UgzobiR8NAwVg4Lly9B4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzxZOdhjjCbKwxqTKJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyukrAKxHWM6qqBTBx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwywsAts-qku9t4ycZ4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzFtdb7s7nbLQq2Sxd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwwlmFKJIK9NfnSqW94AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugwlp3rXI7EOXDG4E_B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]
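The raw response is a JSON array of per-comment records, one object per comment id, each carrying the four coding dimensions. A minimal sketch of how such a batch might be parsed and indexed by comment id (the function and dimension-list names are illustrative, not part of the tool's actual API; the excerpt keeps two of the ten records shown above):

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw = '''[
  {"id": "ytc_UgzFtdb7s7nbLQq2Sxd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwwlmFKJIK9NfnSqW94AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]'''

# The four coding dimensions displayed in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse the model output and index each coded record by comment id.

    Missing dimensions fall back to "unclear", mirroring how the
    scheme labels uncodable values.
    """
    records = json.loads(raw_json)
    return {r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS}
            for r in records}

codings = index_codings(raw)
print(codings["ytc_UgzFtdb7s7nbLQq2Sxd4AaABAg"])
# → {'responsibility': 'distributed', 'reasoning': 'consequentialist', 'policy': 'unclear', 'emotion': 'fear'}
```

Note that the values shown in the coding-result table (distributed / consequentialist / unclear / fear) match the array entry with id `ytc_UgzFtdb7s7nbLQq2Sxd4AaABAg`, which appears to be the record for the comment displayed above.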