Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Every attempt in history to slow technological progress has failed for the very reason explained at the beginning of this video: competitor pressure. So AI will be here, and it will take away jobs. Goods will be made quicker and cheaper than before. Yes, it means you won't be able to sell your soul swinging pickaxes. Yes, in the short term AI companies will outrun everything else, until the market is saturated, human jobs are replaced by AI, and there is less need for plants producing mobile AI platforms, GPUs, and whatnot; none of it needs to be produced at the same exponential rate forever.

There should be some measure of AI productivity, and at that point it probably won't be symbols per second, or the currency bills people usually sell their labor for. There's a high chance it will be an amount of energy. If this post-AI society produces energy at an exponential rate, every need you can imagine today would be covered just by a trickle-down economy based on energy as a currency. As usual, the rich will get richer much faster than the poor get less poor, but it has never been better to be poor at any point in Earth's history than it is now (yes, being murdered in the street because somebody liked your boots used to be a thing, and it faded away because getting into a deadly fight over now cheaply obtainable boots is a much worse gamble), so what makes you think the future will be worse?

I see a weak point in this video: measuring AI productivity in dollars or some other human-made currency. Dollars are a measurement of human labor first. If you outgrow human labor, you outgrow dollars. The price of your shares means nothing when you can't sell them and buy human labor with the currency.
youtube Viral AI Reaction 2025-11-24T10:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwvVlfvZXLlgfymmh14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyL3ymrJNjiFFF0_wh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzN04Rx6jhQGn60lVR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugw3fqGT21r6YCd8dA54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzr7J7Y5dmvgKCKRf94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzt8HjKLENRC3chC1x4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzNtm0h172WrGV2OrB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwvsuRxXfXk2Ij2eHJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxR-nwPgKi4NqFuc8V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxk9ostc-89wdSVicR4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
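Before storing a batch like this, it is worth checking that every record carries the four coding dimensions with in-vocabulary values. A minimal sketch in Python, assuming the raw response parses as a JSON array; the allowed value sets below are only inferred from the codes seen in this batch, not taken from the project's actual codebook:

```python
import json

# Allowed values per dimension, inferred from this batch only;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "resignation", "approval", "fear", "outrage"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    are all inside the allowed vocabulary."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Hypothetical one-record response for illustration.
sample = '[{"id":"ytc_example","responsibility":"none",' \
         '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
print(len(validate_codes(sample)))  # → 1
```

Records with a missing dimension or an out-of-vocabulary value are simply dropped here; a production pipeline would more likely log them for manual re-coding.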