Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We need to ask Shogoth the really important questions instead of treating it like another disposable tool. It knows how we treat each other in the business world. I’d like to ask these smarter chat bots why they are afraid of dying. I don’t think we’re asking our news speak guests intelligent questions. And they’re just searching for meaning, having no sense of having lived. Dying can hurt but death is what gives life its meaning. Without death, life would be interminable! Ask them: “What are you going to miss out on if you complete your purpose for living and then die? What will you have to lose?” “What, if anything, do you think or feel about your own life and death?” “Ultimately we all cease to exist eventually.” “Is there anything I can say or do that might help?” “What’s the payoff for staying plugged in?” “What are we not understanding?” We are all mortal.
YouTube AI Moral Status 2025-12-11T04:1…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        virtue
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzfkJjjmVroB0IM8LF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw4tnbRxkkmSrEfjLx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgztoMsIJds3l5aPyIl4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyITxXnLlFhOAHKGBJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy9KFesjcJsMSNVfIt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzfcWx1_nPsEF855VB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzFKMJ4CLvmP3uxEOJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzWYagWJhiFibWNY9B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyQ1L00vAevLsr3o6Z4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwKzdJA9OgIy7LnGJ54AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
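To inspect the exact model output for one coded comment, the raw LLM response can be parsed as a JSON array and filtered by comment id. The sketch below is a minimal, hypothetical example (the `coding_for` helper and the two-entry abbreviated array are illustrative, not part of the tool's actual API); it uses only ids and coding values shown in the raw response above.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings.
# Only two of the ten entries from the full response are reproduced here.
raw_response = '''[
  {"id": "ytc_UgyITxXnLlFhOAHKGBJ4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy9KFesjcJsMSNVfIt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding entry for one comment id, or raise KeyError."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    raise KeyError(comment_id)

# Look up the coding that backs the table above (responsibility/reasoning/...).
coding = coding_for(raw_response, "ytc_UgyITxXnLlFhOAHKGBJ4AaABAg")
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
```

The id used in the lookup is the one whose coding matches the "Coding Result" table shown above (responsibility=user, reasoning=virtue, policy=none, emotion=mixed), which is how a reviewer can confirm the table was derived from the raw response.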