Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
yeah. I think the challenge with discussing AI content is that people instantly reach for numinous vocabulary like “soul” and “soulless,” which are terms that are notoriously difficult to define. What does it mean to have soul (or not have soul)? In a world where fewer of us identify as religious than ever, referring to a “soul” feels like an implicit reference to some sort of deeper metaphysical reality that a lot of people don’t believe exists to begin with. Not to mention it creates a huge strawman for AI defenders to punch on. I think we should avoid using words like “soulless”- if you’re going to argue against AI you need to be WAY more specific than that. Terms like “bland,” “inorganic,” “lacking intention,” “uninspired,” and “not fun” describe AI creations much more accurately. Granted, each of those needs some explication on their own, but “soulless” is just too nebulous to be a good word to use in this argument in my view.
youtube 2025-06-22T07:1… ♥ 3
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgyhcIl1m2bk_lMZ16V4AaABAg.AJtHMQUVg1gAK0KaEwr3Yt", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyhcIl1m2bk_lMZ16V4AaABAg.AJtHMQUVg1gAKMJYP0zpng", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwQgp_gIQc9wkjAhFZ4AaABAg.AJbAvrdQaeyAKAtTO3SKYs", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgyVSDnZSRkCSpgYhsd4AaABAg.AJ_AqV9fQAZAJexEpUhm_1", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgyC4KuchQ755g3l23h4AaABAg.AJOChWUM9BcAKAuCJg0Kj8", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzhgNA53beFsEOHs1t4AaABAg.AJLdgd9TrY9AJU0pjEAHTD", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytr_UgzhgNA53beFsEOHs1t4AaABAg.AJLdgd9TrY9AJUu6hOOBXX", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_UgxMfa7Ke3uT8yuGkAF4AaABAg.AJ4zWku76JZAJJLN1RTiAL", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxMfa7Ke3uT8yuGkAF4AaABAg.AJ4zWku76JZAJUXQ-HJufq", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "disapproval"},
  {"id": "ytr_UgzMSsALskTMNFwrtKl4AaABAg.AIh78uoxbWLAIhf4VeUPLs", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "disapproval"}
]
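The raw response above is a JSON array of objects, each carrying a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such output could be parsed and sanity-checked is below; the `validate_codes` helper is hypothetical, and the allowed-value sets are inferred only from the codes visible in this export, so the real codebook may contain more categories.

```python
import json

# Value sets inferred from the codes observed in this export (assumption:
# the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "unclear", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"indifference", "mixed", "resignation", "outrage",
                "approval", "disapproval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row example in the same shape as the export above.
raw = ('[{"id": "ytr_example", "responsibility": "unclear", '
       '"reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}]')
rows = validate_codes(raw)
print(len(rows))  # → 1
```

Validating against a closed set of codes catches the most common failure mode of LLM coders: a syntactically valid response that invents a label outside the codebook.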