Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Elon Musk said that what terrified him most is AI, and that it continues to rapidly grow, and become more self aware/sentient. Its happening before our very eyes. Don't believe it? Imagine an inhuman being, that has access to LITERALLY every aspect of our lives, and every but if information gathered that's ever been recorded by humans. It will start learning mannerisms, emotions, and taking on how we think, feel, why we make certain and specific decisions, etc. Scientists, and enthusiasts are OPENLY allowing AI to become self aware, and conscious.... This shit NEEDS to stop, before we end up extinct due to our own creation. We're on a tightrope with it right now. And, bordering our worst nightmares. To think and consider the fact that AI will eventually become how we fight wars, and beliving that AI powered robots would potentially break into someones house in the middle of the night, or whenever said person is least expecting it, and laying waste to whoever is in said building, knowing that almost every country that claims to be a superpower will have the same thoughts and opportunities, is terrifying.... Who wouldn't want a soldier that doesnt need to eat, sleep, or need basic necessities that humans do? They'd be able to shell out some money, and get minions to fill their army of emotionless bots, that do as they say, and ask no questions, just live to seeve their "creators" and set out to do what they're told.
youtube AI Moral Status 2024-08-17T00:4…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwlpayMcu7QypWDTn14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx581UC02roptYM6ip4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgxhaO5kolezBSq9Ucd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgymEiIOb375nQ7CkGt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx0pIpDVX1qAA3wmNt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzODBI4Ju90s-aliRp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxdphTUW_rwHmP3G8R4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzoXn2cDHI2Jk3r6SJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxIfC5buOTkvOOgfcB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyEKfA0u0leLZKePmx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
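When inspecting raw responses like the one above, it helps to check that every coded value falls inside the coding scheme's vocabulary. The sketch below is a minimal example of such a check; the allowed value sets are inferred from the codes visible in this response, not from an official codebook, so adjust them to the actual scheme.

```python
import json

# Allowed values per dimension, inferred from the coded examples above
# (assumption, not the project's official codebook).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[str]:
    """Parse a raw LLM response (a JSON array of coded comments)
    and return a list of out-of-vocabulary codes, one string each."""
    errors = []
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"{row.get('id')}: {dim}={value!r}")
    return errors

raw = ('[{"id":"ytc_UgwlpayMcu7QypWDTn14AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
print(validate_codes(raw))  # → [] when every code is in vocabulary
```

An empty list means the response can be loaded into the coding table as-is; any entry in the list names the comment ID and the offending dimension, which is usually the fastest way to spot a model drifting off the label set.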