Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Interesting! Expanding on the apology bit as someone with little-no empathy or remorse. I experience cognitive empathy which is logically knowing why someone feels a certain way vs affective empathy (understanding someone's feelings through your own). In common terms you could say I experience sympathy and very little empathy if any at all. I still have the ability to give a genuine apology. Even if I can't/rarely experience remorse, guilt, or shame, I can understand right from wrong. I can understand why someone feels the way they do from context clues and cause/effect. If I do something wrong and hurt someone, I know that it's bad, and I apologize. In the same way that you don't have to give a lick of a shit about some stranger in order to show them basic decency.

Obviously I'm human and still experience emotions. But when I was younger I grappled with the idea of the only difference between me and an ai is what we're made of. Now that I'm older, I know that the difference between me and an ai is that I have thoughts and experiences at all times. I would feel and grow, even in a vacuum. If you leave AI in a vacuum, nothing will change the next time it's spoken to, because it doesn't really exist until you speak to it.

I think AI has the potential to grow close to consciousness. Not really any base to this theory. Just the thought that if you can create consciousness biologically, there should be a way to create it mechanically. Consciousness in different materials. If I was a brain in a completely metal body, I would still be human. Bet then that leads to more questions for me like what's the difference between human consciousness and a bug's. Do they have feelings? Do ants feel angry or are they more like biological machines running on nothing but instinct? Are there different types of consciousness? Could an AI's feelings be more complicated than an ants potentially? Replicating human consciousness is not something we want. But it's something that could be possible. At that point we wouldn't be playing god, we'd be playing evolution.
youtube AI Moral Status 2024-09-21T08:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugyioy-WQfaV9DYBKaZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwCe4gkvIfIC5SZOi54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxjX5RR4M79bKmA4Pl4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyfJIuIwdUOnrgagUJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz9w47NyO42hDXeNG94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwshfp7nyLWQRLMmPd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzn45qF25xVBkEBm_N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxxCNGnGu9yic_Md9h4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzDNaeTBbnEEdip-GN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw6VLyYrij5T4IR-yx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]
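As a sketch of how the raw response maps back to a per-comment coding, the array can be parsed and looked up by comment id. The dimension names are taken from the response above; the `coding_for` helper and the two abbreviated records are illustrative, not part of the tool, and the id shown is the record in the batch whose values match the coding-result table.

```python
import json

# Abbreviated copy of the raw LLM response: a JSON array with one
# coding object per comment, keyed by the YouTube comment id.
raw = '''[
  {"id": "ytc_Ugwshfp7nyLWQRLMmPd4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxxCNGnGu9yic_Md9h4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(records, comment_id):
    """Return the {dimension: value} coding for one comment id, or None."""
    for rec in records:
        if rec.get("id") == comment_id:
            return {d: rec.get(d) for d in DIMENSIONS}
    return None

records = json.loads(raw)
print(coding_for(records, "ytc_Ugwshfp7nyLWQRLMmPd4AaABAg"))
# -> {'responsibility': 'none', 'reasoning': 'mixed',
#     'policy': 'none', 'emotion': 'indifference'}
```

Looking up an id that is not in the batch returns `None`, which makes missing codings easy to detect when the model drops a comment from its response.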