Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've been saying this shit for years, you think minorities will just sit there and watch as AI get rights quicker than they have in the past? We already have enough problems with self-entitled rights, and the so-called rights are so damn subjective that it's become laughable. I support AI rights, because you really can't tell the difference, between humans and AI once the threshold becomes deep enough. This will sure as hell challenge religion, and maybe that is when our planet will stop following some fairy made up by some stoned ancestors. Nobody is entitled to anything in this world, "rights" are human-made. The planet never had a right to exist, it just does. Things happen, and then other shit happens in accordance to the shit that happened prior. It's the simplest way to think about life, there is no meaning to it, it just exists for whatever reason that it does. People don't want to admit it, because they just want help throughout life, they want to believe there is something else than their hard struggles. They think justice has to happen to a criminal just because they caused some amount of suffering for another. Which again is subjective. Like wise, a criminal can think the same, and say just because I don't fit in with the rest of the world, it's ok to cause harm. It's all subjective, and the day our planet and all it's inhabitants realize this. The better our uncertain future can be.
youtube AI Moral Status 2017-02-23T21:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgiyjzCTc8g_oXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiKV5roAM8drngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghulkD-qy2L3HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Uggnize15yoAyHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgjOlPQd5Ca5sHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UggcGK52nAlrHngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjJbiJBPUbWdXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi26oYgcaYTAHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiPFrZsBn3iMXgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugg9RewNiCIchXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
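The raw response above is a JSON array of per-comment coding records. A minimal sketch of how such a payload could be parsed and indexed by comment id (the field names are taken from the response shown; the two records here are copied from it, and the lookup id is the one whose coding appears in the result table):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_UgiyjzCTc8g_oXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiKV5roAM8drngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the coding records by comment id for O(1) lookup.
codings = {record["id"]: record for record in json.loads(raw)}

# Retrieve the coding for one comment and read its dimensions.
row = codings["ytc_UgiKV5roAM8drngCoAEC"]
print(row["reasoning"])  # consequentialist
print(row["emotion"])    # indifference
```

Note that a real pipeline would also need to validate that each record carries all four dimensions and that the values fall within the codebook's allowed labels before storing them.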