Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The AI hysteria is just like any other other. Fear fed by those with a vested interest or an agenda: Those that want to sell it, governments that want control, media that want attention, people that want something to aim their fear/anger at. We don't know what human consciousness or sentience is. Why would we think we have any idea if AI has similar things? If we say they have 'different' sentience, what can that possibly mean when we don't understand our own? What is the point of making that observation. An apple has 'different' sentience. So? It's interesting academic conversation, sure, but it's not worth spending (wasting) any money on or worrying about. We have much more important and pressing existential things to worry about. The only reason AI has any importance is because it _can_ be used to do worthwhile things. The main reason it has importance has come to be, unfortunately, that it is being so horribly hyped and mis-used. I can't comprehend the amount of effort, time, money and resources that must be currently being wasted. The sheer amount of computer chips and the associated power and cooling required is what we should be concerned about, not hand-ringing about whether or not we might be being mean to possibly differently sentient 'beings'. What we should really be trying to understand about AI is how we stop the hysteria and the hype. How we stop it's misuse ruining businesses and becoming a damaging media phenomenon. Good gods I hope the AI bubble bursts before we waste too much more time on it. We have more important things to worry about than the overblown tool we have come to call "AI".
Source: youtube · "AI Moral Status" · 2025-08-15T11:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugw_WsuXF_bXQ8mF3Cd4AaABAg", "responsibility": "ai_itself",   "reasoning": "mixed",            "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_UgwmPd8JdmAi5ZdvS4h4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_UgzmON0wOlrj--jtBBJ4AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgwdPzzqKqdX6L5y7qB4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",     "emotion": "indifference"},
  {"id": "ytc_Ugyyid_ghsQv4PoPcc54AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "none",     "emotion": "resignation"},
  {"id": "ytc_UgxFM0NwirVXvgBR7cl4AaABAg", "responsibility": "ai_itself",   "reasoning": "mixed",            "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgwIV4F31q63ip-MhcZ4AaABAg", "responsibility": "ai_itself",   "reasoning": "mixed",            "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgyhiAiSOXcf_ecu3S14AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",     "emotion": "approval"},
  {"id": "ytc_UgzDQOZwoLmKpjEXfM14AaABAg", "responsibility": "ai_itself",   "reasoning": "mixed",            "policy": "none",     "emotion": "fear"},
  {"id": "ytc_Ugz8pUcJizUr0pdQxh94AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate", "emotion": "mixed"}
]
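The raw response is a JSON array of per-comment codings keyed by comment id, so recovering the coding result for one comment is a parse-and-lookup. A minimal sketch (the `raw` string below reproduces two records from the array above; which id actually corresponds to the quoted comment is an assumption, since the export does not state it):

```python
import json

# Two records copied from the raw LLM response above.
raw = """[
  {"id": "ytc_UgzmON0wOlrj--jtBBJ4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz8pUcJizUr0pdQxh94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]"""

# Parse the array and index it by comment id for O(1) lookup.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up one comment's coding across the four dimensions.
coding = by_id["ytc_UgzmON0wOlrj--jtBBJ4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coding[dimension]}")
```

In practice the model output may not be valid JSON (truncation, stray prose around the array), so a production version of this lookup would wrap `json.loads` in error handling rather than assume a clean parse.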