Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It will never be conscious. Self-awareness is not a logic problem. Consciousness isn't just a mere quality of calculations; it's a result of many constant ongoing _physical_ phenomena. It is not the result of a really, really good neural calculator, it is the result of these poor neural calculators being one and the same part of a grand system, with all of the constant dynamic physical influences and experiences that come with it. Its birth, its life, its growing and eating and air pressure and hydration and chemicals etc. We aren't electric machines in a skull; our wetware brains are our body, which is constantly, dynamically influenced by the physical world that it interacts with. It's both in tune and struggling with the chaos it exists in. While my takeaway may be philosophical, how biology interacts with physics isn't. Consciousness isn't just reflecting on thought for thought's sake, it's reflecting on the whole animal experience. Which you can't simulate. Consciousness is a process. It's not logic, it's logic + chaos. We are children of a chaotic system, and we are fascinated by this. All living things can physically feel this because their bodies/selves are chaotic systems. And we yearn to understand it and why we exist in it. And we bang our logical calculator brains against it. That's what makes us conscious. Dumb animals looking at a mirror is a sign of this same existential phenomenon. Humans just have the neural capacity to consider the consequences of this self-curiosity. A robot will never feel these things because its brain will always exist in a defined space. It will never be a natural presence in chaos, and therefore will never sense a self-presence. Even if it simulates being a human, deems itself to be one, and seems identical to one, it is inherently a mental vacuum in a black box.
youtube · AI Moral Status · 2023-08-22T18:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
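For orientation, the coded record above can be thought of as a small typed structure: four coding dimensions plus the coding timestamp. The sketch below is illustrative only; the `CodedComment` dataclass is a hypothetical helper, and the example values in the comments are simply the labels observed in this page's raw response, not a guaranteed complete codebook.

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coded comment: four coding dimensions plus the coding timestamp."""
    responsibility: str  # observed values: "none", "developer", "company", "ai_itself"
    reasoning: str       # observed values: "unclear", "consequentialist", "virtue"
    policy: str          # observed values: "none", "regulate"
    emotion: str         # observed values: "indifference", "fear", "approval", "outrage", "mixed"
    coded_at: str        # ISO 8601 timestamp of when the coding was produced

# The record shown in the table above.
record = CodedComment(
    responsibility="none",
    reasoning="unclear",
    policy="none",
    emotion="indifference",
    coded_at="2026-04-26T23:09:12.988011",
)
```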
Raw LLM Response
[ {"id":"ytc_UgxkAq7P-KDLH3NONvp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugx35Zjl757FTvcfygd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgwBsQkXDQzmbDY5gNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugzzp4bBOOpvlksJTIJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwKuc1HKJ56q8hh7mB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyWlyJ0W8u7iS7LuzV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxQUQiifJwBnWeWmvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwbmJOsAxNLeWiWGcx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyFlAbiwVBVndoZpTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzNh1mh2CXn4t2tr0N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"} ]