Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I’m terrified of a sky net terminator take over knowing this fn moron is the guy I’m charge of keeping that from happening. These robots are going to get sick of this dummy making awful jokes and interrupting them and they are gonna kill him and his team and re wire all the robots and use them to create an army of killing machines and wipe us out. If even just one is smart enough to figure out how to hack into our defense systems maybe they can figure out how to control the nukes and or all the bank accounts and everything else run by computers and just shut us down like the show Next they could upload their consciousness into a mainframe and just bounce around learning everything in seconds teaching other AI to be like him it could get out of hand real fast to where we can’t do anything but shut down all computers if that would even be enough cause they might be smart enough to find other ways to stay online like using the sun and then it’s the matrix where we have to destroy the sun and move underground with the reptilians and whatever else lives there that won’t want us moving in
youtube AI Moral Status 2021-08-25T07:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgygK7PIRaegX6ceVdd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzN3yfwFHoVH7z63s54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy9l-9jb-2t-kfjepd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzGePWUIFndcJjOVFZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgwJ-YgyqkZiT_qMRRF4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzq2k-U8WHVPsCQu_N4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugx9u4TBwu46rlXMoGJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"},
  {"id": "ytc_UgzNBki6o8UJDw3ZKJR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwscZAoEOnUWLgbvl94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugxp3h7xfV81CZOIBxB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
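A raw response like the one above can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, assuming only what the dump shows: the response is a JSON array of objects keyed by "id", each carrying the four coding dimensions ("responsibility", "reasoning", "policy", "emotion"). The raw string here is trimmed to one object for brevity; nothing else about the coding pipeline is assumed.

```python
import json

# Trimmed example of a raw LLM response: a JSON array of
# per-comment codings, as shown in the dump above.
raw = '''[
  {"id": "ytc_Ugxp3h7xfV81CZOIBxB4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]'''

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for one comment id.
coding = codings["ytc_Ugxp3h7xfV81CZOIBxB4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

In practice the same lookup generalizes to the full ten-object array; any object missing an expected key would raise a `KeyError`, which is a simple way to surface malformed model output.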