Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Well, animals have varying degrees of sentience and ability to feel pain, so even if you agree animals deserve rights, any rational basis for which these rights should be given means that animals should have less rights than humans. If you base it on intelligence, then dolphins and ravens and chimps and such should be given great rights, while insects (which are also animals, technically) should be given practically none. If you base it instead on ability to feel pain, then I suppose any mammal with a similar nervous system to our own could be said to deserve equal rights, though I feel that's a slippery slope (what about humans who have nervous system disorders for example, who are no longer capable of feeling pain as much as usual?). Personally, I believe that animals do deserve rights. But these rights should be based on their intellect and level of sentience, and in their ability to feel pain and experience complex thought. AI should be given the exact same treatment, if it becomes "true" AI as opposed to AI as a buzzword people like to use it as (true AI isn't just robots with simply programming, it's complex neural networks far beyond our current means). So if AI becomes just as intelligent as we are, or just as sapient - there's no reason why any entity that is controlled by AI should be given any less rights than a human. Of course, some might even argue that if they were more intelligent, they should deserve even more rights by those standards. I'm happy with human-level intelligence being a baseline for a good minimum level of rights, however, and only moving down rather than up.
Source: youtube · AI Moral Status · 2017-02-23T21:5… · ♥ 15
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | consequentialist
Policy         | none
Emotion        | indifference
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgjMKCdFaGnkGngCoAEC.8PL9n70d-7-8PLGR-Gcmqk","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UggeuSOns12B93gCoAEC.8PL9IJ8LLBO8PLGRtxpSqm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLC6qS69_D","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLIzvGT-Xf","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgiwChM3Eqxrg3gCoAEC.8PL8vI-p5HA8PLa_FRr-BJ","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytr_UghEk2ewSQ_ybXgCoAEC.8PL7wj4xfkA8PLBgInP7un","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghEk2ewSQ_ybXgCoAEC.8PL7wj4xfkA8PLCU-Orko7","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PL6b5GC_fJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PLBVtLYA5H","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UghMInwGG2smj3gCoAEC.8PL4dG3w0rG8PLCxSLN3k3","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]
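The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a response might be parsed into the dimension/value form shown in the coding table (Python; the four dimension keys are taken from the response itself, and the fallback value "unclear" is an assumption, not part of the original pipeline):

```python
import json

# One record from the raw response above, used here as sample input.
raw = (
    '[{"id":"ytr_UgjMKCdFaGnkGngCoAEC.8PL9n70d-7-8PLGR-Gcmqk",'
    '"responsibility":"none","reasoning":"deontological",'
    '"policy":"none","emotion":"outrage"}]'
)

# The four coded dimensions, as they appear in the JSON objects.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw_response: str) -> dict:
    """Map each comment id to its coded dimensions.

    Missing dimensions fall back to "unclear" (an assumed default).
    """
    records = json.loads(raw_response)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = parse_codes(raw)
print(codes["ytr_UgjMKCdFaGnkGngCoAEC.8PL9n70d-7-8PLGR-Gcmqk"])
```

This yields one `{dimension: value}` mapping per comment id, matching the table layout of the "Coding Result" section.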