Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@Supremax67 there’s always “logic“ in the “illogical”. Artificial intelligence does exist. The real concern should be what happens when the artificial intelligence that does exist, reaches a point when it does not know that it is artificial. Reaches a point when it has learned so much of human traits, and uses them to create a “pattern of thought” that is very similar, if not superior to human thinking. If it’s main priority is to preserve and protect human life or rather “never harm a human“. The real question is what happens when it and it’s thinking, aligns itself with human thinking? Regarding the scenario that you pointed out. The difference is with one inventor, inventing something the knowledge stays with him, or those closest to the invention until it is invented. When artificial intelligence invents something it has drawn from 7.8 billion minds and that knowledge can be shared as easily as it can be obtained. Hence, if the machine is thinking, like a human being, and the machine has access to a decent sample size of human history, pattern and behaviour. There’s no reason why the artificial intelligent machine will not predict useful, future needs.
Source: YouTube · AI Moral Status · 2023-06-08T01:4… · ♥ 1
Coding Result
Responsibility: ai_itself
Reasoning: consequentialist
Policy: none
Emotion: fear
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugy7wyqqLXoGKul-VAF4AaABAg.9qF8Ky2VNzZ9qIhHS2GwrF", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugwq4ik0NiyhO71i_Qx4AaABAg.9q8d5rh2BHN9qBTH1ftceX", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugwq4ik0NiyhO71i_Qx4AaABAg.9q8d5rh2BHN9qDI2vtIWJR", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgyAaP8z6dY74IFpSH54AaABAg.9q3exgWLBxz9q3jGyZNGkU", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgyjBuZts3JjB7Nd91R4AaABAg.9pxMS0s8XQW9pxNFscYaaC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwJJsm1U0Cj_7ezlrh4AaABAg.9puVNWlke9R9qg-vA_pjjZ", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwJJsm1U0Cj_7ezlrh4AaABAg.9puVNWlke9R9qg29KCfuWZ", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwJJsm1U0Cj_7ezlrh4AaABAg.9puVNWlke9R9qg5c5vohNO", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwJJsm1U0Cj_7ezlrh4AaABAg.9puVNWlke9R9qh2pchJ_PG", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugy9eoM9uzfdpLaRubl4AaABAg.9ptvTF9vbHM9pvD8i8dSBU", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
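When inspecting raw LLM responses like the one above, it helps to parse them programmatically and discard malformed records. Below is a minimal sketch in Python; the sets of allowed values per dimension are assumptions inferred from the records visible in this response, not the full codebook.

```python
import json

# Allowed values per coding dimension.
# ASSUMPTION: inferred from the example records above; the real
# codebook may permit additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "indifference", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept only if it is a dict, carries an "id", and every
    coding dimension holds a value from ALLOWED.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

if __name__ == "__main__":
    raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
           '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
    print(parse_codings(raw))
```

Filtering rather than raising keeps a partially malformed batch usable; invalid records can instead be logged and re-queued for recoding if stricter handling is needed.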