Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
the issue is that the spartans didn't have a normal society to go back to. spending the weekend pillaging is only a moral failure to society based purely on how visceral it feels to really do that shit, in ways that video games now can't possibly get at, touch, smell, whatever. maybe it makes the society like sparta or mongolia overtime. based on how fun they make it to torture robots. but either way, their "pain" is not the same thing as organic pain that human's can relate to simply bc it doesn't have the same wiring that nerves/hormones/failure points that humans do. basically, if we don't build suffering into them, they just can't. The awesome thing I took from Westworld is the idea that the phenomenology of consciousness/awareness arises during emotionally painful events. Which tracks organic consciousness perfectly. Also Anthropic figured out how to give agents automatic very surgical (weight) resets to their behavior when they get off-course. no way for them to think their way out of it, it's out of the agent's control. after hearing that, i feel like calling AI conscious is like calling a screwdriver driven.
youtube AI Moral Status 2026-04-04T00:0… ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyD39jyb34Camv1WEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxaQtbVGrGKfxQ5aLl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzENmiTpDjE6m7oCmt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgybqbcJXB-5-MDFE594AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgygcsXhWmSoawe-quN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwsKOagYbBaP9yU11N4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxz42wGjGl-rhvGqY94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzjWRdi_1sxrBYmI1N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz0l7yz065oLJFBswZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyjE0muEcTiNC1cVhZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
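The batch response above maps each comment id to its coded dimensions; the "Coding Result" table for one comment is just the matching row pulled out of this array. A minimal Python sketch of that lookup follows. It is illustrative only: the `codes_for` helper and the truncated `RAW` sample are assumptions, not the pipeline's actual code, though the field names mirror the JSON shown.

```python
import json

# Truncated sample of a raw batch response (two rows from the array above).
RAW = """[
  {"id": "ytc_UgybqbcJXB-5-MDFE594AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwsKOagYbBaP9yU11N4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]"""

def codes_for(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id, or {} if absent."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), {})

row = codes_for(RAW, "ytc_UgybqbcJXB-5-MDFE594AaABAg")
print(row["reasoning"])  # deontological
```

Looking up by `id` rather than by position keeps the display robust if the model returns rows out of order or drops a comment from the batch.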