Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I don't care if we automate human labor -- so long as we do something like universal income. Because let's be honest. Those "good jobs" in physical labor ends up resulting in many Americans sacrificing their bodies for a living. They don't get to play with their kids like they'd like to. They end up unable to do certain leisurely things and even essential things because of work place injuries. Wait staff have to deal with assholes. People who curse, and assault (including sexual assault of groping and the like) them while they work and they have to smile to either keep that job or not risk death or both. I'd love a world where things like this can be automated by machines and humans can pursue their passions. The arts, science, etc. We need to stop tying our worth to what things we can make for others. It's not "AI and everyone suffers" or "No ai / limited AI and people still have to sacrifice their physical and mental well being at the altar of making barely enough to survive." You want people to have healthier and happier lives -- use the technology to do the dangerous stuff and let people pursue their passions. Let theme experiment and try to build things. Give them the safety net to try something new, to invent something, to research something and through that we all succeed and benefit.
youtube AI Jobs 2025-10-09T02:0…
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | consequentialist
Policy         | liability
Emotion        | approval
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyhD52aCVVJ_9_5IXx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxEeQpHZgnpciq2ued4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx2gguhLcR7UD7nQMB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyFaJXHWpBbkQQRg454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyg50cEN8kYDmJKzrd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyzJg7kNOuc1ZB0h-l4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxjDgthLEwnCDD7YIJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzoEFOsF2GHemsGQd94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzeD7un_RMqZxohc1V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzm-GL3nKYmZJCEH654AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
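The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions plus the comment id. A minimal sketch of turning that output into a lookup keyed by id, with a loud failure if the model omits a field (Python and the standard `json` module; the `index_codings` helper name is an illustration, not part of the pipeline):

```python
import json

# Two entries copied from the raw response above; the full array has ten.
raw = """
[
  {"id": "ytc_UgyhD52aCVVJ_9_5IXx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgxEeQpHZgnpciq2ued4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict:
    """Parse the model output and key each coding by its comment id."""
    records = json.loads(payload)
    coded = {}
    for rec in records:
        # Reject records where the model dropped the id or a dimension.
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        coded[rec["id"]] = {k: rec[k] for k in DIMENSIONS}
    return coded

coded = index_codings(raw)
print(coded["ytc_UgyhD52aCVVJ_9_5IXx4AaABAg"]["emotion"])  # approval
```

Keying by id makes it easy to join each coding back to its source comment, and the missing-field check catches the most common malformed-output failure before it silently propagates into the coded dataset.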