Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Automation is never an issue. As long as there is stuff to get done, which there always will be, anyone who is productive will benefit. There are two cases to explain. The first, to dismiss immediately, is the case where resources do not constrain AI. In this scenario, we are post-scarcity and you can just enact whatever Utopian vision you want. This will never happen. In reality, there won't be infinite availability of AI; it will be constrained by resources and have a positive cost. As that cost decreases, the _comparative_ advantage between humans and AI changes and our productivity frontier expands overall, so long as we specialize. This means that the relative value of low-value jobs which aren't amenable to AI yet, or aren't prioritized yet, will go up, and people will work these jobs as their wages rise. Remember, automation adds more stuff to the economy, and hence the value of those goods and services will go down relative to those not yet automated. Trade is always mutually beneficial, and productivity tools always make everyone better off than they were beforehand.
youtube AI Harm Incident 2024-07-29T16:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzX5_w0VIziVjTUDJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxxvn5dEWGH6YA-rlp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwk-cuVmzOCTVXnv3R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyMT0f7f6JiMM4xVct4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzzMSBd3luJNyVEueZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_lHbpfVQ2IKY6mNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyj_I-TpRoqB1aEayN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwt5qU4i-acfFWEThh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy0XAEG_3kPw7PJTZR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyCrySiNA9Rt7VYSXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
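A raw response in this shape can be checked and tallied with a few lines of Python. This is an illustrative sketch, not the actual coding pipeline; the field names (id, responsibility, reasoning, policy, emotion) follow the response shown above, and the two records here are a shortened sample of it.

```python
import json
from collections import Counter

# Shortened sample of the raw LLM response above (same schema).
raw = '''[
  {"id":"ytc_UgzX5_w0VIziVjTUDJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_lHbpfVQ2IKY6mNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

rows = json.loads(raw)

# Sanity-check that every record carries all coded dimensions.
expected = {"id", "responsibility", "reasoning", "policy", "emotion"}
assert all(expected <= set(r) for r in rows)

# Tally one dimension across the batch.
emotions = Counter(r["emotion"] for r in rows)
print(emotions["approval"])  # -> 1
```

The same pattern extends to the other dimensions (e.g. `Counter(r["policy"] for r in rows)`) or to looking up a single comment by its `ytc_` id.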