Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I like this line of thought a lot. While there’s the somewhat comedic idea of an ai thinking “I don’t want to be conscious” which is itself a statement a conscious being would have, you also made a really good point. If there is no need for a program to be conscious, why would it be? Maybe that’s the reason why animals don’t have to be super conscious? There’s no need! I don’t know a lot about evolution but it seems humans developed consciousness quite by accident through the need to communicate about the past present and future. If being conscious isn’t the most efficient path to doing whatever ai needs to, it might never develop consciousness. I do think it one day will, maybe by accident, maybe by means of being so advanced it’s practically a virtual construction of a biological brain. Regardless I think this topic is endlessly fascinating and I love your prospective! <3
Source: YouTube — "AI Moral Status" — 2023-08-04T05:2…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxA2w141Dm-8eHhYPx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxnUW2yaZhukX7oy5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwZGq0BI37FUx2IyxV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwQueV0Wf4BCTc5X_l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxV2BkEY5T-K1vUKFt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyEuAvQk5n7HVqskJN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKNUJ1LHg67mXh6vt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxVc0zOUlhbTK8faAp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzUFxOAJeNRHx-SfIZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXzCXhzs4qmEnk64J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
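To inspect the model output for a single coded comment, the raw response can be parsed as a JSON array and searched by comment id. A minimal Python sketch under that assumption — the `coding_for` helper is illustrative (not part of the coding pipeline), and `raw_response` is abbreviated to two of the records shown above:

```python
import json

# Abbreviated raw LLM response (two records from the array above, for illustration).
raw_response = '''
[
  {"id": "ytc_UgzUFxOAJeNRHx-SfIZ4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxA2w141Dm-8eHhYPx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
'''

def coding_for(comment_id, response_text):
    """Parse the raw LLM response and return the coding record for one comment id.

    Returns None when the id is absent from the response.
    """
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

coding = coding_for("ytc_UgzUFxOAJeNRHx-SfIZ4AaABAg", raw_response)
print(coding["responsibility"], coding["emotion"])  # developer approval
```

The record returned here matches the Coding Result table above (responsibility = developer, emotion = approval).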