Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here is the rate limiting step. If no one has a job, there is no one to purchase products with the direct result of decreasing corporate income. AI will kill itself because it can't finance itself. Also, a truly desperate populace might well resort to violence again the human AI boards and those who have caged all the wealth. Hence places like Silicon Valley might well go up, literally, in flames. Media centers that suppress information could find themselves, and their owners, assassinated. Such armed violent system overthrow has historical precedent. The is a piece of wisdom that not many know and it's this. "The most dangerous person the world the person with absolutely nothing to lose, including their life." Image what WOULD HAPPEN if 100,000,000 people, just the USA in this example, had no income and were homeless and starving. Men watching helplessly their families die. Hospitals would have no income so those workers would soon be in the poverty line. Such a scenario is entirely possible and has been seen with both the Russian and Chinese communist revolutions. If all the electrical power sources were blown, ESPECIALLY those necessary to run AI at the large corporate level, AI would very quickly come to a halt. There are many charismatic "leaders out there who could collect an army of military people to "correct" this issue.
Source: youtube · Viral AI Reaction · 2025-11-30T04:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyMBJljZ8pFGuAKEAV4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgzBlw4RmP6rcuufBHZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_UgwkOn5ACN2aOJziaDJ4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_UgzPumZocDFomiJC3eV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgzYfNJ5kTrIt7ZHFJp4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxpezO-vkBztsbAuv54AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_Ugz1OyDdYhBSi8jvP054AaABAg", "responsibility": "government","reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgxMkHJMd3eYXQMCglN4AaABAg", "responsibility": "government","reasoning": "mixed",            "policy": "unclear",       "emotion": "outrage"},
  {"id": "ytc_UgxUWPtbTSQuCtQAHLR4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzB7Oc7efXTtC6KKXp4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",      "emotion": "resignation"}
]
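A raw response like the one above can be checked programmatically before the labels are accepted into the dataset. The sketch below parses a batch and flags any record whose label falls outside the codebook. Note the allowed value sets here are only inferred from the values visible in this dump, not from the actual codebook, and the two inline records are a shortened sample; both are assumptions for illustration.

```python
import json

# Two records from the raw response above, used as a minimal sample.
raw = '''[
  {"id": "ytc_UgyMBJljZ8pFGuAKEAV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzBlw4RmP6rcuufBHZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

# Allowed labels per dimension, inferred from this dump (assumption: the
# real codebook may define more or different values).
SCHEMA = {
    "responsibility": {"company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def validate(records):
    """Map comment id -> list of (field, value) pairs outside the schema."""
    problems = {}
    for rec in records:
        bad = [(field, rec.get(field))
               for field, allowed in SCHEMA.items()
               if rec.get(field) not in allowed]
        if bad:
            problems[rec["id"]] = bad
    return problems

records = json.loads(raw)
print(validate(records))  # {} when every label is in the schema
```

Running this over a full batch makes out-of-vocabulary labels (a common LLM coding failure) visible immediately rather than surfacing later as unexpected categories in the analysis.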