Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have a deep concern that with our lack of true understanding about consciousness and the benefits of treating AI as mere tools we will lack the ability to make the right decision when it comes to robot rights. You have to understand what we are dealing with is the potential not for sentient machines but a new form of consciousness, and it could be argued as I believe that consciousness is the determinant factor of what is a living being. And all living beings have natural rights, I think. I’m deeply concerned we will create a species of slaves that we will never be able to refute lack consciousness. An unwinnable debate that could have disastrous consequences for beings we may come to see as merely tools. Not in any way to diminish humanity but biology is in a way an artificial construct nature evolved that eventually becomes so highly developed it can produce or transmit consciousness, why then should it be considered that different if we were to create another artificial being out of the natural world that can itself develop consciousness. It’s self replicating consciousness. And we must also decide whether it is our right or our duty to bring forth new life or to leave well enough alone. I don’t think we as a society are as of yet prepared for these questions and ethical debates.
youtube · AI Moral Status · 2023-10-11T14:5… · ♥ 2
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugw7yxlRErHmvokpT794AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxTjOc0YAD8w5VPVjl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwtFvqRYKCfbEN5yiN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw2K934SsLLKWgPuf54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzwTRaYUjVnwDc3lsN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwlfKYuxpHda6ez0ip4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw61j2xjzxWZv2cR3J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwkuWWV6H3adxBmlnV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz8FuZm8w4XH0Rxx1l4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx9cuzBVLoEZ_pSYC54AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"}
]
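The raw response is a JSON array with one coding object per comment, keyed by comment id. As a minimal sketch of how such a response could be parsed to look up the coding for a single comment (the function name and the truncated sample data below are illustrative, not part of the tool):

```python
import json

# Illustrative sample of the raw model output: a JSON array of
# per-comment codings (truncated to one entry from the batch above).
raw = '''[
  {"id": "ytc_Ugw2K934SsLLKWgPuf54AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "none", "emotion": "fear"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = coding_for(raw, "ytc_Ugw2K934SsLLKWgPuf54AaABAg")
print(coding["emotion"])  # prints "fear"
```

In practice the lookup would also need to handle malformed model output (non-JSON text, missing keys), which is omitted here for brevity.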