Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wednesday Addams and Enid Sinclaire don't want people saying that they're dating on social media - they say it invaded their privacy, and makes them uncomfortable. Performers have this thing where they embrace the wealth and fame of what they do; but they reject the parasocial attachments that they make with their fans. That's show business. You don't have fans without the parasocial element. So why is this relevant? If performers want privacy, and fans want to feel like they are somehow involved in the lives of their idols, then the answer is very simple: replace the performers with artificial intelligence. People could create their own content, and no one's privacy would be interrupted. I've also thought about a.i. ads on the internet: what if all of our ads were designed for us personally? Instead of seeing some politically or racially charged propaganda; you could see something that truly related to who you were as a person (I think that would increase sells - when I see things I don't like in an ad, I refuse to buy the product). Finally we could replace politicians. If anything could and should be replaced, it is the human element within the government. Edit: Nobody cared at all whenever you guys poured thousands and thousands of immigrants into the U.S.A. to work for bottom dollar. You replaced us in the workforce: don't think anybody is going to cry for you losing your job now to a damned computer.
youtube AI Jobs 2025-10-08T10:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyD1BOX0xZnmHMDQ9d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzm2B7uq2ISs6vqMZJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzI7HOkdi_EABHLULd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzKxxt4dHDAKYwF3ah4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzTELtWBJ6_eaiUKZ94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxmNby9aXUETLrhzXx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyAEnMoqCJa2YpLbLR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxVXQc0pkShKDkP2Nt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgyaLmo0NTzz0KPhqn14AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwI8ecv7rUkslurdGB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
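The raw response is a JSON array with one object per coded comment. As a minimal sketch of how such a response could be parsed and checked, the following validates each row against the coding dimensions; the field names come from the response shown above, but the set of allowed values per dimension is an assumption inferred from these ten rows (the real codebook may define more):

```python
import json

# Allowed codes per dimension, inferred from the rows shown above.
# Assumption: the actual codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "government", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-schema values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}"
                )
    return rows

# First row of the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgyD1BOX0xZnmHMDQ9d4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
rows = parse_coding_response(raw)
print(rows[0]["emotion"])  # indifference
```

Validating before storing means a malformed or hallucinated label fails loudly at ingestion rather than silently polluting the coded dataset.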