Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Neil's optimism about AI is great for people like him, people that have no longer any pressure in regards to their material existence. Of course you can be an optimist when you're in such a comfortable position. And you can tell people to "try and see it coming" and "find something that the AI can't do". But what does that really mean to a mom working in some office doing paperwork that suddenly gets done by some AI powered software system? It means shit to her. She needs a job to pay the bills and feed the kids. What is she supposed to do? Become a master musician in 3 weeks? Get a medical degree for a million bugs while working 2 jobs because she "saw it coming"? His optimism is not really appropriate for these people. They will simply lose their jobs and be fucked. Most of them anyway. And since there will be thousands or even millions affected by these changes, if they indeed are coming (which I don't necessarily think), even finding "something that AI can't do" will be a long shot, since everyone's looking for that something, but what exactly is that? They can't all become waiters. And saying "well, just be innovative" ... yeah thanks, that's really helpful. Why didn't I think about that? Being optimistic about that while being comfortably secure and famous is easy. But that's not the reality for most people.
youtube AI Moral Status 2025-07-24T12:2… ♥ 5
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       virtue
Policy          unclear
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzH6TXipICLYs9pgFp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzyWf18CvHO95gfAoR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwX5YcFnlSjRPk1Vap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxCHXgRxtpPUg6cd994AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwMUMBop0KNA46o58N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugywzy_0LtEhRErC4Lt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwsUjs54L74Xgnvgxx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxwELpb3zk4KZ5kEjJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyIB2DOo1JeJsdFHKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzaBNGj1b-H78DO7AZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
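The raw response is a JSON array with one object per coded comment. A minimal Python sketch of turning that string into a lookup table keyed by comment id, using only ids and values visible in the response above (truncated to two entries for brevity; in practice the full model output string would be passed in):

```python
import json

# Two entries copied from the raw LLM response shown above.
raw = '''[
  {"id": "ytc_UgxwELpb3zk4KZ5kEjJ4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzaBNGj1b-H78DO7AZ4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]'''

# Parse the array and index the records by comment id.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Retrieve the four coded dimensions for one comment.
code = codes["ytc_UgxwELpb3zk4KZ5kEjJ4AaABAg"]
print(code["responsibility"], code["reasoning"], code["policy"], code["emotion"])
# distributed virtue unclear resignation
```

Note that this record matches the Coding Result table above (distributed / virtue / unclear / resignation), which suggests it is the entry for the displayed comment.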