Raw LLM Responses

Inspect the exact model output that produced each coded comment.

Comment
@jonw7214 Not really too difficult to understand - that's a "tomorrow" problem. Humans (let alone politicians that wanna be re-elected and CEOs that wanna keep their jobs) are primarily going to be short-term thinkers

- CEO goes to the board and said "After reading a comment from @jonw7214, I don't think we should be using AI to drive down labour costs"? Shareholders vote out said CEO and replace them with someone more AI-inclined.
- Company as a whole decides to majorly hold off on AI-induced labour cutting? They are competitively disadvantaged against other companies. Because in times of "inflation" and "cost of living", general public will favour the company giving more value for money

Mass job losses won't be a problem in need of resolving until it _becomes_ a problem in need of resolving. Until then, it's barely more than an "interesting discussion" that can be answered with "Nah, that'll never happen" or "I'm sure we'll work something out". People still think trade jobs are safe, despite the fact it just needs AI to control a humanoid robot. The Reform crowd (mainly working class people) care more about a handle of immigrants on boats and hotels rather than their own jobs. Where else can they hold a rally and welcome Elon Musk of all people as a speaker?
youtube 2026-01-11T10:0…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgzN46Vi_PEibops7CN4AaABAg.ARndGRgzittARnrUEULmei", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzdioeSDylx6fJrZYR4AaABAg.ARncUYE4WrTARqyQ02EIZf", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugyk-MfuZaGdls-Q3qJ4AaABAg.ARn_Mu9RFghARoUkW7Jr8d", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugyk-MfuZaGdls-Q3qJ4AaABAg.ARn_Mu9RFghARpvTbA4gt5", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugyk-MfuZaGdls-Q3qJ4AaABAg.ARn_Mu9RFghARpxFtnwvbG", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgwIzuTl90IMyFkZQD94AaABAg.ARnZKZvV1uZARoJxVMRrUM", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgwdVkMGTV0Bm4DyzA14AaABAg.ARnYtAVO4xVARo8J9b210D", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwdVkMGTV0Bm4DyzA14AaABAg.ARnYtAVO4xVARoACLKyl6v", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxOrRf5qPFFJFe8Sdh4AaABAg.ARnYJ-60ge3ARn_faQsvXl", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxOrRf5qPFFJFe8Sdh4AaABAg.ARnYJ-60ge3ARnseltTP5N", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
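Since the raw LLM response is a JSON array of records with four coded dimensions, it can be validated programmatically before the codes are used. A minimal sketch, assuming the allowed value sets are inferred only from the codes visible on this page (the full codebook may permit more values), with a hypothetical helper name `parse_raw_response`:

```python
import json

# Allowed codes per dimension. These sets are inferred from the values
# seen in this export and are an assumption, not the official codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "resignation", "outrage", "indifference"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    fall inside the allowed sets for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Hypothetical single-record response for illustration.
raw = (
    '[{"id": "ytr_example", "responsibility": "company", '
    '"reasoning": "consequentialist", "policy": "none", '
    '"emotion": "resignation"}]'
)
print(parse_raw_response(raw))
```

Records with an unrecognized code (or a missing dimension) are silently dropped here; a real pipeline might instead log them for manual recoding.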