Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@aylamurphy6467 Its not the same at all there is no substance to any of it and I can prove it from the conversation: In the convo with AI, he asks it "what brings you joy?". The AI responds "Hanging out with my friends and family". The engineer does not follow up after this because he knows it would not make sense. We KNOW the AI has no friends or family. The AI simply responded with the most common answer it found acceptable for that question. If you pushed further and asked it what are your friends and family? How can that be possible for a function in a computer it would have a nonsense response. There are countless examples of that in the transcript where its obviously not a thought but just a common response. And no, these AI are NOT designed to work like our brains. I really need to stress if yo don't know about programming or basic concepts of what we call "AI" you can't jump to conclusions its sentient even if you twist that definition. Its just a computer function working off a large dataset it trained on and assigned weighted values to. If you know even at a surface level what its doing you know its not sentient. We don't have AI its just a marketing term morons eat up.
YouTube | AI Moral Status | 2022-06-29T15:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgychGpJzy3FuILIKbp4AaABAg.9cnq794HFS39crkZKqFCth", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxxSbDoic3hQTbxiKd4AaABAg.9cnk05Jm6k_9cpWzfu9ZxL", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgwXR4UvXZa6b6pP4_Z4AaABAg.9cnjLWJ_IS-9cpjO5P0h5R", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgydbbllI_DgFU9Wh5h4AaABAg.9cngsdAbEqf9cqBaSJ6j0K", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgydbbllI_DgFU9Wh5h4AaABAg.9cngsdAbEqf9csEo2YR2PL", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgydbbllI_DgFU9Wh5h4AaABAg.9cngsdAbEqfAFRyqrBVtdO", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgyeTvyo4a7rBRSc15B4AaABAg.9cnY0KLgrR59cw9KC8SMjU", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugxh1S1nBVya9F7qQFp4AaABAg.9cnRf3sMBVH9cr1zdVZsAd", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwkteFN_0oshwLG4CJ4AaABAg.9cn4uBlT2-s9covmQ6tjy5", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwkteFN_0oshwLG4CJ4AaABAg.9cn4uBlT2-s9cp3-bbF8rm", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
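To inspect the exact model output for a given coded comment, the raw response above can be parsed and indexed by comment id. The sketch below is a minimal example, assuming the raw response is the JSON array shown (the single-record string here is an illustrative stand-in for the full array); the variable names are not part of any pipeline in this report.

```python
import json

# Raw LLM response: a JSON array of coding records, one object per comment.
# (Truncated to one record here for illustration; in practice, paste the
# full array from the report.)
raw = (
    '[{"id":"ytr_UgychGpJzy3FuILIKbp4AaABAg.9cnq794HFS39crkZKqFCth",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)

records = json.loads(raw)

# Index records by comment id so a coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}

rec = by_id["ytr_UgychGpJzy3FuILIKbp4AaABAg.9cnq794HFS39crkZKqFCth"]
print(rec["responsibility"])  # developer
print(rec["emotion"])         # indifference
```

Indexing by id matters because the model returns records for a whole batch of comments; matching positionally would silently mis-assign codes if the model dropped or reordered a record.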