Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
Interesting take, I think that AI will be dangerous once it has been generalized and also utilized in humanoid form, IE once its in robot form and also has the dexterity that a normal human has. If robots are capable of doing the same things as Plumbers, Carpenters, mechanics, any and all trades.... then the world is finished unless we learn to coexist with the new lifeform... and whos to say that the AI will spare the rich CEO's... doubt that will happen. The AI will be smart enough to know that it will have some basic needs 1. energy, probably in the form of a battery, and a working grid to charge the battery. 2. more compute power, IE Video cards... if it wants to keep getting smarter. that's basically it. I predict it will take us 15- 20 years to get to the point from now till we strike a deal with our AI overlords to keep us around. If it doesn't work out just remember all you need as a human is hope..... also food, sunlight, other humans, oh and clean drinking water. Energy is key, and in order to still use some of todays technology far in the future once AI has gone berserk, then we need to limit the critical systems that AI is integrated into right now.
YouTube · Viral AI Reaction · 2025-11-27T19:0…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | ai_itself
Reasoning      | consequentialist
Policy         | unclear
Emotion        | fear
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyCyS4zo8xROLqJ3WF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwCKRX-clcw4RAYbnR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzByFuKJiIBgW_UhJl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzPlosiC_sWfYmwtHt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxgHR9aIDHwBgyDEXF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwSRAADEfjxsM7au3l4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwSeqVVorwA3OHwnTN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxNIjShUsBhoYDEUzd4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugzmp00XzzbigCaTzeB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwqe5kJm76LUhhMheN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
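A raw response like the one above can be checked before the codes are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the Coding Result table; the allowed value sets are inferred only from the codes observed in this response, so the real codebook may permit more values, and the function name `parse_codes` is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# response (assumption -- the full codebook may include additional values).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records with an id and
    a recognized value for every coding dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment id cannot be linked back
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = parse_codes(raw)
```

Dropping rather than repairing malformed records keeps the downstream counts honest: a record the model coded outside the codebook is a coding failure, not a value to be guessed at.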