Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm not scared of AI; they are already used today in the form of deep learning and neural networks. The smarter ones, such as self-driving cars, use a lot of processing power, requiring a very high-end CPU/GPU that is far better than most consumer-grade PCs. I don't think it's impossible for an AI to decide to take control and not listen to instructions, but there would be a failsafe, and aspects of the AI that aren't directly controllable by the AI; for example, the motor control could go through extra programmed code before it reaches the physical motor. AI robots would be expensive because of their high-end components, and all the hardware required wouldn't be cheap either. For example, for a drone with a gun: the high-end motors to hold the gun, the extra motors to fire the gun, the cameras, all the sensors, the frame, and the processor would be expensive for a terrorist to get their hands on, and while they could afford a few, they couldn't spend thousands of dollars on a swarm of drones. A few drones with guns would be very easy to shoot down and destroy.
youtube 2019-04-22T23:3… ♥ 1
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           industry_self
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy3boToPB_xWnwDHgh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyh2gJc47ez_S9dRKN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyIE2I8RCmsT7k9A9Z4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzldFL3hAVhJj1xO9B4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzXHaMIBlkxJAOtj8d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzFQY3tdogCJuB7cOR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz9HDU1etCXTMNqZPJ4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy5xzYdlGdJWiWktBp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgwP-7b0tk7S3HzwoAB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyyVZ6vNhke3sRzYqV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
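The raw response is a JSON array with one record per comment, each carrying an id plus the four coding dimensions. A minimal Python sketch of how such a response could be parsed and validated (the embedded string is a two-record subset copied from the response above; the variable and key names are assumptions, not part of the original pipeline):

```python
import json

# Two records copied from the raw LLM response above (subset for brevity).
raw = (
    '[{"id":"ytc_Ugy3boToPB_xWnwDHgh4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_Ugy5xzYdlGdJWiWktBp4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}]'
)

# Every record must carry the comment id plus all four coding dimensions.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)

for r in records:
    missing = EXPECTED_KEYS - set(r)
    if missing:
        raise ValueError(f"record {r.get('id')} is missing {missing}")

# Index by comment id so a coded record can be looked up and compared
# against a Coding Result table like the one shown above.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_Ugy5xzYdlGdJWiWktBp4AaABAg"]["policy"])  # industry_self
```

Note that the second record matches the Coding Result table above (developer / consequentialist / industry_self / indifference), which is consistent with that id belonging to the displayed comment.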