Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
shockwaveBJ: Sounds a lot like that one question in psychology about punishing a thief for stealing medicine for their spouse. Truthfully I'm bad at hypotheticals I honestly don't know what I would do, I would like to say I'd save them without causing any harm but there is never a guarantee for a happy ending if I would even succeed. I think I would try to but if it resulted in the death of the owner I do not know what I would do. I know I said I would pick humanity over AI but that was if the AI was a threat in this case it's not. I imagine death wouldn't work the same so the possibility of the AI coming back might happen. What about you?
youtube AI Moral Status 2017-02-24T04:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  user
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgjfKgT77yIRgXgCoAEC.8PLvMB1vEek8PMcKkCYsMy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UghWaSEwEkcguXgCoAEC.8PLtBW8cfos8PMISgZrZsU","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UghWaSEwEkcguXgCoAEC.8PLtBW8cfos8PMJC2PkewR","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugg49UPt_mbmQngCoAEC.8PLqpOz4maH8PLrC3LVEQO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjdvH8T2RSiyXgCoAEC.8PLqFRYgDKK8PLrCM_JqvT","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgjWICpRRV2hXHgCoAEC.8PLmXp7ofiC8PLpOdyngpl","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgjWICpRRV2hXHgCoAEC.8PLmXp7ofiC8PLtHgEONzQ","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgjWICpRRV2hXHgCoAEC.8PLmXp7ofiC8PLvd5amokr","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgjWICpRRV2hXHgCoAEC.8PLmXp7ofiC8PM6Sx0osTw","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_UggbZ1hHV4QnIHgCoAEC.8PLdAqYUUsy8PLeVVWAiUC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
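The raw response above is a JSON array of coding records, each carrying an id plus the five dimensions shown in the table. A minimal sketch of turning such a payload into a per-comment lookup (the `parse_codings` helper is hypothetical, not part of the tool; the two sample records are copied verbatim from the raw response above):

```python
import json

# Two of the ten records from the raw LLM response above, used as sample input.
raw_response = '''
[
  {"id": "ytr_UgjfKgT77yIRgXgCoAEC.8PLvMB1vEek8PMcKkCYsMy",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgjWICpRRV2hXHgCoAEC.8PLmXp7ofiC8PLpOdyngpl",
   "responsibility": "user", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"}
]
'''

# Every record is expected to carry exactly these keys.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into a lookup table keyed by comment id.

    Raises ValueError if the payload is not a JSON array of complete records.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    codings = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        codings[rec["id"]] = rec
    return codings

codings = parse_codings(raw_response)
# Look up the coding for one comment id.
rec = codings["ytr_UgjWICpRRV2hXHgCoAEC.8PLmXp7ofiC8PLpOdyngpl"]
print(rec["responsibility"], rec["emotion"])  # → user mixed
```

Validating the key set on ingest is the point of the sketch: a model that drops or renames a dimension fails loudly at parse time instead of silently producing a partial coding result.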