Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So earth and humans coexist as life , there is a harmony, then there is engineering and scientists , they seem to work to dehumanize and destroy the earth even to make their machines , so robots in a moderate degree may not destroy the planet but using them to make machines and take humans place for engineers to design their own utopia is in itself evil and dangerous. Engineers and scientists seem to have a sick delusion of justice. Until the whole world is enslaved to work on their utopia and indulgence of destruction and fornicating with creation . The illusion is in the value of their machines if they are destroying the planet tk make machines and products that are a luxury including computers , lying about sustainability and sustainable practices , culture and the economy. Robots may seem cool but in reality every machine takes a little away from life and earth just to make it. If you cut down a tree another will grow back , if you take the oil out of the earth for a machine it can’t be replaced , if you take coal or metals from the earth they won’t be replaced , isn’t it strange how you can’t make a robot from living matter but humans are alive, so is something dead more important than life or something not life more important than life and if a human is in dominion over rock or metal than how not a robot or machine , machines are slaves because they are not life, is it greed that any machine exists , could be debated.
youtube AI Moral Status 2023-02-13T05:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxIP-6TRMu7VcGkJY14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw9vxOiCz2X0N1kmwh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzXLMfAiUn-KPX4K2x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwIevhAM0tctSTginl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzgmnL-ytc-MzbVQQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyp8M2hLDYryw07TFp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzgYZJ0NB_IJvmblaB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzVP5ykVByPxRH66GV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgziXIm57qgqjH_wtYV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy7D7ymZMeBMIX-h7x4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
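The raw response is a JSON array of records, each keyed by a comment id and carrying the four coded dimensions. A minimal sketch of parsing and validating such a batch is shown below; the `index_by_id` helper is hypothetical (not part of the tool), and only two records from the response above are kept for brevity.

```python
import json

# Excerpt of the raw LLM batch response shown above (two records kept for brevity).
raw = '''[
  {"id": "ytc_Ugyp8M2hLDYryw07TFp4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxIP-6TRMu7VcGkJY14AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse a batch response and key each coded record by its comment id."""
    records = json.loads(payload)
    out = {}
    for rec in records:
        # Fail loudly if the model dropped a dimension for some record.
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

coded = index_by_id(raw)
print(coded["ytc_Ugyp8M2hLDYryw07TFp4AaABAg"]["policy"])  # regulate
```

Looking up the id of the comment above (`ytc_Ugyp8M2hLDYryw07TFp4AaABAg`) recovers the coded values shown in the result table.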