Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We've already figured this out, pretty much. It's not great. When something achieves a level that we previously used as a definition of "what it is to be human"... we move the goalposts. It doesn't matter if it's an animal, or an AI or a robot. We just move the goalposts, in the most hypocritical ways possible. An AI passed the Turing Test. Oh, now that's an illegitimate test. This is not new. And history has things to teach us. Ever hear the story of John Henry? The guy who disputed that a machine could do more work than a human? Doing hard work, creating new things, was what he, and many others, believed to be the fundamental human aspect. And what was he willing to do to defend this? Die. This is not a minor issue for human beings. They are willing to die and kill to defend their identity as human beings and the idea that human beings are fundamentally special. This is bad. The combination of these two things... where does that lead us when machines get better and better? When every aspect of humanity gets progressively stripped away? Well, what aspects of humanity are we going to never put into a machine? Think about that for a bit. Once we've put everything we want more of into machines... the only thing left for humans will be the worst of us. Hate. Genocide. Suicide. And we will kill to prove it.
youtube 2016-09-10T02:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       mixed
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgijOXwzX5ll4HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugi_L9Ps1Ao3wngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}, {"id":"ytc_UgjE_qt3DXc4AXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugg7AdD3sDYLcHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgicXkrK5at_b3gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"}, {"id":"ytc_UgiQFXdWgMS6SXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UghbylCg24GyCXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjxB2hYHk0ringCoAEC","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}, {"id":"ytc_UggLyqhud7inwngCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"indifference"}, {"id":"ytc_UgjsMxoDrjmOY3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"} ]