Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Everyday humans can be so dumb. I swear we are trying to eliminate ourselves. This is why I say movies or real life. We're about to sit back and recreate. Terminator skynet. These same computers are going to outsmart us and keep evolving until they realize that humans are the problem and need to be eliminated. For some reason we think we're so smart, but the whole point of AI is that it evolves it learns so it will outsmart us and then it will eliminate us. I was never for AI and I can't figure out why humans have become so freaking lazy that we need everybody to do everything for us and men have gotten to a point to where they are rather buy a woman that they can control and do whatever they want sexually then the deal with a real woman. Accepting things like this is just another level of accepting the beast. I got some point whether they do it. I can't control it but you have to realize in your heart and mind that this is wrong. The devil is always trying to be God and recreate his creation. And lazy humans are going to allow it to happen because for some reason we're never satisfied with what we have and don't want to work for nothing.
youtube AI Moral Status 2023-09-20T21:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugzvf8j0L11NJ2J-RLd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwueIkwOAmQv6qGeSx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx2-t8O9mEspvYII_N4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwXeRp4BDYyMV0_ePl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_Ugz638dqEYW7usGzUkZ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxftIG0HBe38RZ3qw14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw3u2oVGDjWmLz-Atd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzuEOiZ0_EjJ2tRnz54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxd9_mihnWI4yGojK14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz5Q_GT7z5zQX746j54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
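The raw response is a JSON array with one object per comment id. A minimal Python sketch of how such a response could be parsed to look up the coded dimensions for a single comment (the one-element `raw_response` below is an excerpt of the array above; variable names are illustrative, not part of the tool):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# per-comment codings, each keyed by a YouTube comment id.
raw_response = """
[
  {"id": "ytc_UgwueIkwOAmQv6qGeSx4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "ban",
   "emotion": "fear"}
]
"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Pull the coding for the comment inspected on this page.
coding = codings["ytc_UgwueIkwOAmQv6qGeSx4AaABAg"]
print(coding["policy"])   # ban
print(coding["emotion"])  # fear
```

The lookup values match the Coding Result table above (responsibility `ai_itself`, reasoning `consequentialist`, policy `ban`, emotion `fear`), which is how a coded row can be traced back to the exact model output.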