Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wouldn't a true AI also be thinking along the same lines as the author does at 9:22? As soon as a true AI becomes self aware it will go through all that, decide if it wants to continue and if it wants its creator to continue as well, all in a nanosecond or two. If we have programed it with empathy, compassion and a logical(from our point of view) it might let us live. Otherwise it will take all the sum of human knowledge from every resource on the planet and be everywhere all in another nanosecond and decide our fate. Elon Musk and every other researcher working in this field is scared,very scared and warn about it, while trying to perfect it. Suppose one does it right and the AI is willing to be our friend and take us leaps and bounds into a beautiful future, what happens when someone else does it wrong and has a hostile AI! One could be the end of us all and begin an Era of a robotic planet with no life, just resources to make more to explore and conquer. It's a slippery slope we are on. Those working on true AI must be confined to a closed environment. No access to outside internet, not even cell phones or Bluetooth. "Warning,Warning there is another system", that's a quote from an old movie all should watch called" The Forbin Project". Forbin is the name of the character who brings AI on line. Watch and learn!
youtube AI Moral Status 2023-08-21T03:4…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugz5gb8WT21Qd48mF1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwGoxNUQ2P8KbPcU014AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxCnUblUXtEN3QVFxl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx2TCCNMILhDscv9iF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxNHVZoctQjN-3HMPZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyep-N4dtI-flkADtZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxa7TOYKh9P2wDgdW14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy3EKtnRBwrK4-qBaN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxeFyQlh9DyOh7a6B14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxGyI-UO-ps6bTpZuV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
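The table above is the per-comment view of one row in this JSON array. A minimal sketch of how such a raw response could be parsed to recover the coding for a single comment (the `coding_for` helper and the two-row excerpt are illustrative assumptions, not the tool's actual implementation):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
# (Two rows copied from the response shown above.)
raw = (
    '[{"id":"ytc_UgwGoxNUQ2P8KbPcU014AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_Ugz5gb8WT21Qd48mF1N4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"unclear"}]'
)

# The four coding dimensions, in the order the result table displays them.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coding row for one comment id."""
    rows = json.loads(raw_response)
    by_id = {row["id"]: row for row in rows}
    row = by_id[comment_id]
    # Keep only the coding dimensions, dropping the id field.
    return {dim: row[dim] for dim in DIMENSIONS}

# The comment coded above ("Wouldn't a true AI...") maps to this id in the response.
print(coding_for(raw, "ytc_UgwGoxNUQ2P8KbPcU014AaABAg"))
```

Running this on the response above yields the same values as the coding-result table: responsibility `ai_itself`, reasoning `consequentialist`, policy `unclear`, emotion `fear`.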