Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People do not think enough about what AI requires to exist. If AI is going to take over humanity, not only does it need to be superintelligent, it's going to need REALLY good robots: humanoid robots that can run power plants, repair data centers, repair leaky roofs, and maintain server racks. So until we have Super AI, a robot army that could defeat humanity, AND a robot army that could maintain power plants and servers, create all the delicate hardware needed for AI to exist like chips and wafers, and maintain those factories and power plants and repair and rebuild them when needed (imagine robots breaking ground on a new power plant and actually building a functioning one; they can barely pick up and pour orange juice)... AND mine all the raw materials, ship the ore around the world, refine it, and build everything, including maintaining facilities to build more robots and service them, we don't have a whole lot to worry about. Think about it. I'm sorry, but I hate to say a lot of these smart people have not really thought this through. We are not creating a phantasm; we are creating a delicate computer program with insanely precise requirements for existence. A war with humans would be short and brutal, and could end with a bucket of water being tossed on a server rack. Humans do not require much to exist, are extremely resilient and resourceful, and only need food and water. The asymmetric warfare we could bring to some tin cans and their supporting infrastructure would be fucking gnarly.
youtube · AI Moral Status · 2025-11-01T08:2… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgzV2V0G7yDp1WKgOBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_Ugy8iSTyp9NyAVDi8ft4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxTM3_p9Zs990HgOTt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgzhHjhvvE2BcdADgod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugy9T-w0Clu46j3hNnB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgxTGxgEQnozxdvv0Kp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugzxn4xso09goJWUAI54AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"}, {"id":"ytc_UgwptZLUKuh6knkJaW54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugw928RQgF47WVOLCOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwgcqhLlRnka9l9Ia14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]