Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What Neil defines as useful compared to what capitalism defines as useful are two completely different things. IMO, his belief that we won't choose to utilize AGI is bordering on ridiculous. Similarly, his belief that AI won't take our jobs necessarily means he is redefining what 'automation' means. If large parts of the economy become automated, by definition, people are not needed. If people find new economically valuable jobs, then the economy won't become automated. But if an employer can get a computer or robot to do that newly created job for less money (AGI), capitalism chooses the lower cost option every time. Unless we implement a completely new economic system in the next few years that aligns with Neil's values, we're going to go through some rough times. The exponential capability growth in AI is real, and as they said, we aren't wired in a way to meaningfully appreciate such growth. The near future will probably be pretty rough because of economic automation, but if we get through that, life could potentially be super amazing for all of us. I'm a bit more pessimistic in the end but my fingers are crossed.
youtube AI Moral Status 2025-07-23T18:2… ♥ 2
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugy7iTpAjBgvNFodIrl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugw3Mr6w--TYoQMzd7l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugy5uL99BUDWUZSpt814AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzVrHxzHO8zqTxr2K14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxjXbz2wScnTE4A1jp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwYndSuBgfmlaemQAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugx05Dskt1mjeZW_nup4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugy0Phabqni1bd0Dd2Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugzb7WsW2pllwxy0IXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugy9fg_nQKfLKYV_-DV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
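Note that the raw response above is not valid JSON: the array is closed with a stray `)` instead of `]`. A plausible explanation for the all-"unclear" coding result is that the pipeline failed to parse this output and fell back to defaults. A minimal sketch of such a defensive parse step (the function name `parse_coding` and the fallback behavior are assumptions, not the app's actual code):

```python
import json


def parse_coding(raw: str):
    """Parse a raw LLM coding response.

    Returns a list of per-comment coding dicts, or None if the output
    is malformed (e.g. the array is closed with ")" instead of "]"),
    so the caller can fall back to "unclear" for every dimension.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        return None  # malformed model output -> signal fallback
    if not isinstance(rows, list):
        return None  # expected a JSON array of coding objects
    return rows


# A well-formed response parses into a list of dicts:
good = '[{"id": "ytc_x", "responsibility": "none", "emotion": "fear"}]'
print(parse_coding(good))

# The stray ")" seen above makes json.loads raise, so we get None:
bad = '[{"id": "ytc_x", "responsibility": "none", "emotion": "fear"})'
print(parse_coding(bad))
```

On a parse failure, recording "unclear" for every dimension (as the table above shows) is a reasonable fallback, since it distinguishes "model output unusable" from a genuine coding value.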