Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here's what I think. Both Alex and his ChatGPT instance flattened this argument to one of logic. There's a missing element in this argument that breaks the whole thing open. And that is: emotions matter. This is what scientists, materialists, and our culture at large still fail to recognize on a regular basis. Objective logic only gets us part of the way. It's only half the picture.

If we take his statement about disappointing his wife because he chose to donate to the charity instead of taking her out to dinner for their anniversary a bit more seriously and speculate a bit about what might happen if he continues to act in ways only according to these so-called "moral obligations" even if only when the traditional and expected occasions where one might treat their significant other with some sort of gift or extra display of affection: birthdays go by with no gifts, no cards; anniversaries are not celebrated with dinners or other splurges; in no way does our subject treat their spouse with something extra, because there is always someone else more needing, more "deserving" of those resources. One might reasonably guess that our subject would find himself without a spouse before long--and maybe this causes either him or his spouse or both of them to spiral into a deep depression, maybe even suicide. Now he's perhaps saved some child from contracting malaria and ultimately traded his own life or the life of his partner for that outcome. When viewed in that light, I believe it shifts the "moral obligation" a bit, wouldn't you say?

And so I think it is a grave mistake to consider only logic based in objectivity when weighing one's responsibilities or obligations. In fact I think this subjectivity and emotional consideration muddies the water enough, that we can rarely deem ourselves able to judge what another person is obligated to do.
youtube 2025-10-20T17:4…
Coding Result
Dimension         Value
Responsibility    none
Reasoning         mixed
Policy            none
Emotion           mixed
Coded at          2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugxv4aE8Ope9ZHiIyq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxPLErCIEKUGChrmu94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxaK3jwaLxTG0T8LbF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzrDS68UTnqnIkNtzZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwmH8PeWZ-iineKNHR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrkXL01ahVDkEXWn94AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxv5VO3YTfBnSGNAfx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyAORyHaXxqlF-mWnZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJTmksDTaMFoS5mWp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYm8RocxVTilulWCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"amusement"}
]
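The raw response is a JSON array with one object per coded comment, keyed by comment id, with one label per dimension. A minimal Python sketch of how such a response could be parsed and validated (the allowed label sets below are inferred only from the values visible in this response, not from a published codebook, and the helper name is hypothetical):

```python
import json

# Label sets inferred from the labels observed in this response; the real
# coding scheme may allow more values than are shown here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed", "virtue"},
    "policy": {"none"},
    "emotion": {"approval", "outrage", "indifference", "mixed", "resignation", "amusement"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: label}},
    rejecting any label outside the expected sets."""
    codes = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} label {row[dim]!r}")
        codes[cid] = {dim: row[dim] for dim in ALLOWED}
    return codes

# One row from the response above, used as a parsing example.
raw = ('[{"id":"ytc_UgzrDS68UTnqnIkNtzZ4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_UgzrDS68UTnqnIkNtzZ4AaABAg"]["emotion"])  # mixed
```

A lookup by id like this is what lets the per-comment codes in the raw response be matched back to the single "Coding Result" table shown for the comment above.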