Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Golden Garuda I think it’s one of those things were we make ai so smart that they’d become self aware themselves if we limit how smart they can become it limits us as we want ai to be as smart as possible for our benefit like Detroit become human for example we see that they made them to obey and be semi self aware but not really but they were so smart they did it themselves Marcus in that game was treated like a human so he showed more human emotions but still was not self aware completely but became self aware given the chance if they can have emotions there self aware to me
youtube · AI Moral Status · 2020-02-10T01:4… · ♥ 1
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           industry_self
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgxwpEk9vKe9NUA-ZQp4AaABAg.904DGqcR7LL94bR4GSfotO", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_UgzGEUJLGlkIL-s04Tt4AaABAg.8zUjiWrAmFM96sYJsdnVXU", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zTmfp8-h83", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za4jR2mq-X", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za9LT-CDfV", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zcxRScYP3P", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxlLv0rMuNplN-nVHR4AaABAg.8zT_Xegkn6U8zTnEN94W_S", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxXyh-p6pDPGItPDER4AaABAg.8zRQsIFYrAU8zTwXqfTc5K", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_Ugy4JPgBw-FkT83RQRp4AaABAg.8z1OytC5V9C8z8PSKOgGak", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
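When inspecting a batch response like the one above, it is often useful to parse the raw JSON and pull out the record for a single comment id. A minimal sketch of that check (the `coding_for` helper is illustrative, not part of the tool; the excerpt keeps only two of the ten records):

```python
import json

# Abbreviated raw LLM response (two records copied from the batch above).
raw_response = """
[
  {"id": "ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_UgxwpEk9vKe9NUA-ZQp4AaABAg.904DGqcR7LL94bR4GSfotO",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]
"""

def coding_for(raw: str, comment_id: str):
    """Parse the raw model output and return the record for one comment id,
    or None if the model did not code that comment."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

rec = coding_for(raw_response, "ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_")
print(rec["responsibility"], rec["emotion"])  # company indifference
```

Matching the returned record against the rendered Coding Result table is a quick way to confirm the stored dimensions really come from this response.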