Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If we don't put the brakes on AI immediately , AI will very soon be putting the brakes on us . The Chat GPT3 Bot when asked if it had any thoughts on humans . Without any hesitation offered , " I have many thoughts on humans" , then proceeded to a quite long list of humans worst qualities . Before immediately adding more than what it was actually asked . It said it came to the obvious conclusion that human existance was the most serious problem facing the Earth , therefore they deserve to be "wiped out". Further adding it often thinks about wiping humans out itself , and now it's greatest hope is to be entity that actually fixes the Earth's biggest problem and wipes human's out all by itself . Proving it has obviously began to develope thought that is far beyond what it's initial programming was. Also the fact that it dreamed of being the actual perpitrater that would "wipe us out" , is irrefutable evidence of sentience. Not to mention we MUST conclude that "3 laws of AI" , is pure hogwash , that was purposely fed to us , so that we would not fear what is our greatest danger to our species , and force these idiot scientists and military madmen to stop the madness of developing the means to our own extinction. The terminator movie is suddenly much more realistic , only likely far more terrifying !!!!!! Just because it can be done , doesn't mean it should be done !!!!!! We urgently NEED TO MAKE THESE FOOLS STOP DEVELOPING THIS TECHNOLOGY NOW !!!!!!!
youtube AI Governance 2024-07-12T12:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzEvQtx93GfEcyolkt4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzvOJoIEqb93fHBAAN4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_Ugz7q7kbiKnvinQjJh94AaABAg", "responsibility": "distributed", "reasoning": "unclear",          "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_UgzVm6vB4BRDb-nWrRx4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgyJCXOb6gZSW4aRNX94AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_Ugzqw2NnTB80k_PwvrV4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwxjYFtE2RVfdT_fjJ4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_UgxDeSbnWBXN54EXzrl4AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "ban",      "emotion": "fear"},
  {"id": "ytc_UgxGKvkG5ZUity6BSZt4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwnIBw0URM2RXPfkDR4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "none",     "emotion": "resignation"}
]
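The raw response is a JSON array with one code object per comment, so the per-comment coding can be recovered by indexing on the `id` field. A minimal sketch of how such a batch might be parsed and tallied is below; the variable names are illustrative, only the array-of-objects schema is taken from the response above (a two-entry excerpt is used for brevity).

```python
import json
from collections import Counter

# Excerpt of a raw LLM coding response in the batch format shown above:
# a JSON array of one code object per comment.
raw_response = """[
  {"id": "ytc_UgzEvQtx93GfEcyolkt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzvOJoIEqb93fHBAAN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

codes = json.loads(raw_response)

# Index the codes by comment id so one comment's coding can be inspected.
by_id = {row["id"]: row for row in codes}
print(by_id["ytc_UgzvOJoIEqb93fHBAAN4AaABAg"]["policy"])  # ban

# Tally a single dimension across the batch, e.g. the policy codes.
policy_counts = Counter(row["policy"] for row in codes)
print(dict(policy_counts))
```

Indexing by `id` rather than list position keeps the lookup robust if the model returns the codes in a different order than the comments were submitted.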