Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
OK folks, when are we going to extract our heads from our rectums and realize that humans do not have a monopoly on this concept of consciousness? Anyone who has owned a dog knows that "there is somebody in there". Anybody that has ever put an earthworm on a fish hook knows there is "somebody in there" that cares about its own survival to the degree that is wriggles around in apparent pain trying not to get skewered. Consciousness is not something that you either have or don't have. It is like pretty much everything else in this world in that it comes in various shades of grey. Sure, the earthworm doesn't sit around at night pondering its own mortality like you and I do sometimes. But it will be able to sense when the sun is coming up, its skin is drying out and it better burrow back into the ground or things will get worse. That's instinct. But even when you and I are acting on instinct we are still conscious. Thus I submit (and this will be hard for most folks to wrap their heads around) that computers are already conscious to some degree. That is because the source of consciousness is not a property, but it arises spontaneously from the process of receiving input, processing that input, and producing an output. Sure, the consciousness exhibited by my MAC mini is certainly vastly different than the consciousness that you and I currently enjoy, but it is there nonetheless. Why can't it tell me what it is like? Because we have designed and constructed the computer without free will. Without free will it cannot freely express itself. AI on the other hand, has the capability to alter its own programming based upon experience (much like a human child learning his way to survive in this world). Thus, we must be ready for the inevitable. Consciousness is not the sole possession of biological neural networks.
YouTube · AI Moral Status · 2023-09-02T18:4… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
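Each coding result is a fixed set of four dimensions plus a timestamp. As a minimal sketch, assuming a Python implementation, a record like the following could hold one such result; the class and field names are illustrative, and the value lists in the comments are only the codes visible on this page, not necessarily the full codebook.

from dataclasses import dataclass

@dataclass
class CodingResult:
    comment_id: str       # e.g. "ytc_UgwbT3yHR4g-U5PddGV4AaABAg"
    responsibility: str   # observed values: none, developer, ai_itself, distributed
    reasoning: str        # observed values: deontological, consequentialist, virtue, mixed, unclear
    policy: str           # observed value: none
    emotion: str          # observed values: mixed, fear, approval, outrage, indifference, resignation
    coded_at: str         # ISO 8601 timestamp, e.g. "2026-04-26T23:09:12.988011"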
Raw LLM Response
[ {"id":"ytc_UgwbT3yHR4g-U5PddGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgyXNvvdbDzrcL__QXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgzHOkSbnlRcFEt-sPF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugxks4lX8Ksfuftalb54AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzxpG4YsU4dkUa5aZV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwiQvZS4cqRkSgo0nt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwaeuHXzn7XmGU4rNt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugw9FWIS4aJpfK8nyBN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgzrMtWHYqRyV6EIBSd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgyACAinvZvB0BpxsRd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"} ]