Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I believe that the time to determine whether AI robots have things like consciousness and rights is now, before they become self aware. I believe that the single characteristic separating humans with computers is that HUMANS ARE BIOLOGICAL ENTITIES, MACHINES ARE NOT. So in answer to this question, robots do not "deserve" rights anymore than a vacuum cleaner "deserves" rights. In other words, if something can be manufactured in a factory from man made parts then "given" the aspects of human beings, they still DO NOT have what we commonly refer to as a 'conscious,' nor are they heir to to these same human qualities. I also believe that if any entity can have it's entire memory instantly deleted, it is further proof that it is not an actually biological in nature. In strictly religious/spiritual terms, robots have no soul or karma, and in fact have never existed before. In the end, I think we should refer to the novel written in the 1800s by a teenaged girl by the name of Mary Wollencraft Shelley. The name of the book was, "Frankenstein."
Source: YouTube · "AI Moral Status" · 2025-08-12T18:5… · ♥ 2
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugyy1rihbMVh5Gm1lht4AaABAg", "responsibility": "user",      "reasoning": "virtue",          "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgzuKT3-UPboTK0ttEl4AaABAg", "responsibility": "user",      "reasoning": "deontological",   "policy": "none",          "emotion": "approval"},
  {"id": "ytc_Ugz3QECLseQAJU31HOF4AaABAg", "responsibility": "none",      "reasoning": "unclear",         "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzNw45KB6TM7ZfAJml4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgwhKyNDzzKSWqeO0VB4AaABAg", "responsibility": "none",      "reasoning": "deontological",   "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_UgwGWaaI5CMpBNNRkkF4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_UgxqtY1nBXu5nSOjwdR4AaABAg", "responsibility": "none",      "reasoning": "contractualist",  "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugz6Qj55YajPtfGOE6B4AaABAg", "responsibility": "none",      "reasoning": "unclear",         "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_UgyFFFSKz5cvsVZYpth4AaABAg", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugy10WI7WkkILfjEGEp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "fear"}
]
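To inspect the raw output for a single coded comment, the batch response can be parsed and indexed by comment ID. The following is a minimal sketch (not the dashboard's actual code), assuming the response is valid JSON as shown above; the ID used for the lookup is the record whose codes match the Coding Result table (responsibility none, deontological reasoning, no policy, mixed emotion).

```python
import json

# Raw LLM batch response, copied from the dump above: a JSON array
# with one object of codes per comment ID.
raw = '''[
{"id":"ytc_Ugyy1rihbMVh5Gm1lht4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzuKT3-UPboTK0ttEl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3QECLseQAJU31HOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzNw45KB6TM7ZfAJml4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwhKyNDzzKSWqeO0VB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwGWaaI5CMpBNNRkkF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxqtY1nBXu5nSOjwdR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz6Qj55YajPtfGOE6B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyFFFSKz5cvsVZYpth4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy10WI7WkkILfjEGEp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]'''

records = json.loads(raw)

# Index the batch by comment ID so any coded comment can be looked up.
by_id = {rec["id"]: rec for rec in records}

# Record matching the Coding Result table shown above.
codes = by_id["ytc_UgwhKyNDzzKSWqeO0VB4AaABAg"]
print(codes["reasoning"], codes["emotion"])  # deontological mixed
```

Indexing by ID (rather than scanning the list) makes the lookup constant-time and makes it easy to spot duplicate or missing IDs when validating a batch.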