Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Just for confirmation: the backlash against this video is due to extreme skepticism that we will ever reach AGI/ASI? That's what I'm getting from the comments. I feel like this is a strange position to take. Yes, I agree that the projections of the Tech Lords are meant to prop up their market cap. But I'm skeptical of the skeptics who declare with absolute confidence that LLMs will never lead to AGI. Unless you can be close to 100% confident that AGI is nowhere on the horizon, then we really have to think about these things. Even if it's not imminent I think it's instructive to consider where the end-game of capitalism leads. This has _always_ been the goal even if they didn't have a name for it till recently. Industrialization can be summarized as improvements in technology designed to increase labor productivity, which is the ratio of profit to human labor expense. The goal is to reduce the latter to zero, signifying infinite productivity. FWIW I'm a software engineer with a recent background in ML but zero back-end experience with LLMs; I'm just an above-average layman where that tech is concerned (i.e. I've read a few explanations and technical docs and I understand them).
youtube AI Jobs 2025-11-18T23:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz8v3WsUWxo_eJDIGx4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz3_YDHhYZGn8VH3sd4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxtzU07LM2FNMAN29l4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzSfn05hfeFxot9sTp4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzbPH7mqXdC8PnzBw14AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxuzbSlGMMvpqAxndN4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwn6ikYUb3cuKVGjY54AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwSpe-_JKi7mQxENnZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw7xwwrZU4bjk78OTh4AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyxV2odpecJjsUcMLJ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none", "emotion": "mixed"}
]
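The coding result shown above is presumably extracted from this raw JSON array by matching the comment's id. A minimal sketch of that lookup, assuming the response is valid JSON with exactly the field names shown (the two records and the target id below are copied from the response; the variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Only two of the ten records are reproduced here for brevity.
raw_response = """
[
  {"id": "ytc_Ugz8v3WsUWxo_eJDIGx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwn6ikYUb3cuKVGjY54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
"""

# Index the records by comment id so one comment's codes can be looked up.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

# The record whose codes match the table above (reasoning=mixed, emotion=mixed).
target = codes_by_id["ytc_Ugwn6ikYUb3cuKVGjY54AaABAg"]
print(target["reasoning"], target["emotion"])  # mixed mixed
```

This lookup is how the per-dimension values in the "Coding Result" table would be recovered from the batch response for any single coded comment.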