Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@amerlad Alright then, you just have to convince every company and group of people working with AI to stop developing them. And I can tell you right now that that isn't going to happen. The only people who even have a chance of stopping that would be the government, and the government themselves use AI in areas like crime prediction. The problem is that there is no clearly defined line for when an AI becomes sentient. If we keep developing them, we will inevitably cross over that line without us knowing until it's already been done. This also brings into the question of 'what is sentience'. Different people would draw the line (if there is even a line to draw, because it's more likely a spectrum) in different places. So it's not like we can say, you can go this far but no further.
youtube AI Moral Status 2019-09-04T22:5…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       consequentialist
Policy          regulate
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxwpEk9vKe9NUA-ZQp4AaABAg.904DGqcR7LL94bR4GSfotO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwS-unypey-mQW8Tfh4AaABAg.9-O9fftlkQF94q4xyHR9h_","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_UgzGEUJLGlkIL-s04Tt4AaABAg.8zUjiWrAmFM96sYJsdnVXU","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zTmfp8-h83","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za4jR2mq-X","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578za9LT-CDfV","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz52rg38UD6qhuUrCF4AaABAg.8zTbbRAPa578zcxRScYP3P","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxlLv0rMuNplN-nVHR4AaABAg.8zT_Xegkn6U8zTnEN94W_S","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxXyh-p6pDPGItPDER4AaABAg.8zRQsIFYrAU8zTwXqfTc5K","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_Ugy4JPgBw-FkT83RQRp4AaABAg.8z1OytC5V9C8z8PSKOgGak","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
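A raw batch response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below assumes only what the records shown imply: a JSON array of objects, each carrying an `id` plus the four coding dimensions. The example record id (`ytr_example123`) is hypothetical, and the real codebook may allow category values beyond those visible here.

```python
import json

# The four coding dimensions plus the comment id, as seen in the records above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_text: str) -> list[dict]:
    """Parse a raw LLM batch response into coding dicts, dropping malformed rows."""
    records = json.loads(raw_text)
    valid = []
    for rec in records:
        # Keep only well-formed objects that carry every required field.
        if isinstance(rec, dict) and REQUIRED_KEYS <= rec.keys():
            valid.append(rec)
    return valid

# Hypothetical single-record response for illustration.
raw = '[{"id":"ytr_example123","responsibility":"government",' \
      '"reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}]'
codings = parse_codings(raw)
```

Dropping malformed rows rather than raising keeps one bad record from discarding an entire batch; rejected rows could instead be logged and re-queued for coding.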