Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Surely consciousness isn’t a binary property. (Nevermind the whole sapience/sentience thing, any definition will do.) It’s easy to imagine a being that is only dimly aware of itself. Hell, just think about what it’s “like” to be in a coma. Or drunk. Or anesthetized. Or just the right amount of in between. How much consciousness is enough consciousness before morality kicks in? Do we just ban anything anywhere close to the fuzzy edges so we never have to wonder if we’ve accidentally done a slavery to our adorable little vacuum cleaners? Is there a sliding scale of AI rights where the closer to conscious you are the less humanity is allowed to exploit you? If so, shouldn’t we apply that same moral calculation to humans with less consciousness than others? Surely it’s not ok to genetically engineer a human just dumb enough to not have whatever rights you would find inconvenient. Is trying to limit the intelligence of an AI or to shackle one with the morals we developed to avoid going extinct really ok? If we somehow manage to solve the alignment problem, would it have been the most evil thing any human had ever done?
Source: YouTube · AI Moral Status · 2023-08-20T19:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           ban
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwEXO1zKSfz7Q49yV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
  {"id":"ytc_UgznVTn57qj85vYIIVZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5vMQkT2dyw40E9NB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzAcb78K6_P7ShX6DJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwaRIuWz0x38UMeiJx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzQHMDYNZpCFLZ2_gx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyUPwiEpyboy_sO0xJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyGGI5U-7KOAejzc4B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxPC5HLxsOmR1jcOYJ4AaABAg","responsibility":"media","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz0HuYIrXnkUj5tFbZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
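To get from a raw batch response like the one above to per-comment coding rows, the JSON array has to be parsed and indexed by comment id, ideally with a check that the model emitted every required dimension. The sketch below is a minimal, hypothetical example of that step (the field names come from the response above; the helper name `index_codings` and the validation behavior are assumptions, not the tool's actual implementation):

```python
import json

# A two-record excerpt of the raw batch response shown above.
raw = '''[
  {"id":"ytc_UgwEXO1zKSfz7Q49yV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
  {"id":"ytc_UgznVTn57qj85vYIIVZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# Every record must carry the comment id plus the four coding dimensions.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index the records by comment id.

    Raises ValueError if a record is missing a required field, so
    malformed model output is caught before it reaches the display.
    """
    records = json.loads(raw_json)
    indexed = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing {missing}")
        indexed[rec["id"]] = rec
    return indexed

codings = index_codings(raw)
print(codings["ytc_UgwEXO1zKSfz7Q49yV54AaABAg"]["policy"])  # → ban
```

Indexing by id means the page for any one comment can pull its coding row with a single dictionary lookup, and the up-front validation turns a silently truncated or malformed model response into an explicit error.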