Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To be fair, the point about LLMs role playing is probably a better point than you think. Because that's really all they're ever doing. They don't remember the conversation they had with you yesterday, they don't really understand any of the stuff you're talking about. They're just assuming it as a premise and going from there. Which is all role-playing is.
youtube AI Moral Status 2025-10-31T18:1… ♥ 189
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw9fRkW58DyTLLoULZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwJFaZBAC01Nvug29F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwyxEUdIiMJxaALlsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKiw9HoC0wmQE5H_l4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugxftr9iWrpDZjdZ_VV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy3821imxE2jyNC6nN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyEyiXcwTRpwKMHJMF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2GrD8LWTPkArm1D94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwXTCJzHRcqbUscE7x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxbStWn_djnQ3Rqxsd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
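The raw response is a JSON array of per-comment coding records, so the coding shown above can be recovered by parsing the batch and looking up the comment's id. A minimal sketch of that lookup, using two records copied verbatim from the response (the helper name `coding_for` is illustrative, not part of any pipeline):

```python
import json

# Two records copied verbatim from the raw batch response above.
raw = ('[{"id":"ytc_UgwJFaZBAC01Nvug29F4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"},'
       '{"id":"ytc_UgxKiw9HoC0wmQE5H_l4AaABAg",'
       '"responsibility":"developer","reasoning":"contractualist",'
       '"policy":"regulate","emotion":"approval"}]')

def coding_for(comment_id, payload):
    """Return the coding record for one comment id, or None if absent."""
    for record in json.loads(payload):
        if record.get("id") == comment_id:
            return record
    return None

coding = coding_for("ytc_UgwJFaZBAC01Nvug29F4AaABAg", raw)
print(coding["emotion"])  # indifference
```

Matching on `id` rather than array position is the safer choice here, since the model is not guaranteed to return records in the order the comments were submitted.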