Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
38:35 BIG AGREE Even if people reduce the way of thinking of them as "it's just computers" I think it's extreeeeeeemely bad if we functionally end up making digital slavery a popular normalized thing, and I think the only possibly prevention for that is to assume the worst case scenario ahead of time and don't behave in a way where that *may* be the case. But also even figuring out where to draw the line for that vs all the ways we already use computers that we've already accepted don't have thoughts/feelings/experiences is technical enough that trying to convey that idea to the average person might be extremely difficult, because we don't want people to get to the point where they're afraid of their phone battery dying or of unplugging their PC in the same way they're terrified at somebody murdering a pet dog or cat. And of course because quite a *lot* of people don't even want to have the pain of trying to work through that kind of possibility as a reality, plus the current trend of big tech over-promising with AI, under-delivering, and constantly lying to the average person and business about what exactly they're getting, there's not enough goodwill for people to really be taking people on their word with *any* of that. -- 51:25 Remember Tay? Even without guidance to be anti "woke", that one got pretty nasty pretty darn fast.
youtube · AI Moral Status · 2025-10-30T21:1… · ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyABG2BqQo_bQ0RTeF4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzwKIBkTIjwF5QgSOR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_Ugzp0VQ5QCWvMSJH6-h4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgwXt8u0LAlcm6JcuIJ4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_Ugx2mNarWuP2T8jCTfJ4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgxCJBabiQ3Iz1EJtSp4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgzwP2sI4oMWXokqcHV4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgzfXKjmHwOdcVoYIAd4AaABAg", "responsibility": "company",     "reasoning": "virtue",           "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxNQQH7JScRsLDbMUp4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwnUXuXIdWgn0uB8bd4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"}
]
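To inspect the raw output programmatically, the batch can be parsed as ordinary JSON and indexed by comment id. The sketch below is a minimal, hypothetical example: the allowed value sets are inferred only from the codes visible in this record (the full codebook may define more categories), and the two-element `raw` array is a trimmed stand-in for the ten-record response above.

```python
import json

# Trimmed stand-in for the raw LLM response shown above (same field shape).
raw = '''[
  {"id": "ytc_UgzwKIBkTIjwF5QgSOR4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyABG2BqQo_bQ0RTeF4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]'''

# Allowed values per dimension, inferred from the codes visible in this
# record; an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "distributed", "user", "developer",
                       "company", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "virtue", "unclear",
                  "deontological"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def invalid_ids(records):
    """Return ids of records whose coded values fall outside ALLOWED."""
    bad = []
    for rec in records:
        if any(rec.get(dim) not in ok for dim, ok in ALLOWED.items()):
            bad.append(rec["id"])
    return bad

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

# Look up the coded comment from this page by its id.
print(by_id["ytc_UgzwKIBkTIjwF5QgSOR4AaABAg"]["emotion"])  # fear
print(invalid_ids(records))                                 # []
```

A dict keyed by id makes it easy to cross-check the "Coding Result" table against the raw response for any single comment, and `invalid_ids` flags records where the model strayed from the expected categories.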