Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Narayanan is like a religious leader... "But you need humans for..." Consistency?!? Really? Has this guy ever used customer service for anything? Human customer service agents regularly suck and are quite inconsistent for most companies. Also, people like him keep moving the goalposts regarding what AI supposedly won't be able to do... therefore they won't replace humans? Sorry, that's a shockingly invalid way to make your argument, especially for somebody working at Princeton. The problem isn't people overhyping AI. We will have problems if AI has not been over hyped and we aren't planning for it well in advance with regard to reshaping our economic systems, and we may be totally screwed if we keep ignoring alignment problems... It's possible that AI will just end up being a tool and jobs will shift to areas that AI doesn't dominate as well. That's yet to be seen. And I don't care about hype, but I do care about the lack of concern for potential bad outcomes.
Source: youtube · AI Jobs · 2026-04-19T22:5…
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwPYmj2idqwZ86H3W14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgycfUViqxZHCV6qxL54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzYBf3xTxbStVwePrB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwXUHBFx7G5j-Jk9mp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_UgzY8R4WWffLMpe0ymt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwm9mP3BF9Dnnmk8EZ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyIPCUIttCboRkqxut4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxZ5izurTjWiV0FKSN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugypqbjn97y4iAOJdKp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgyB7Ix2KykPMRlNs7F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]
```
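A raw batch response like the one above can be turned into per-comment lookups with a few lines of parsing. This is a minimal sketch, not the pipeline's actual code: the `index_codes` helper and the two-record excerpt are illustrative, and it assumes the model returns a well-formed JSON array with the four coding dimensions shown.

```python
import json

# Hypothetical two-record excerpt in the same shape as the raw response above.
raw = '''[
  {"id": "ytc_UgzY8R4WWffLMpe0ymt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyB7Ix2KykPMRlNs7F4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse a batch response and index records by comment id,
    keeping only the expected dimensions and defaulting any
    missing dimension to "unclear"."""
    records = json.loads(raw_json)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw)
print(codes["ytc_UgzY8R4WWffLMpe0ymt4AaABAg"]["responsibility"])  # developer
```

Indexing by id makes it easy to render the "Coding Result" table for any single comment, as this page does for the comment coded developer / consequentialist / regulate / outrage.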