Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
An exchange with Gemini, asking why companies would pay employees if AI can do a majority of jobs for free, and stipulating that there wouldn't be enough data scientist and engineer jobs for everyone.

Economic Feedback Loop: If corporations replaced everyone with AI and no one had jobs, who would buy their products and services? A mass unemployment scenario would collapse the consumer economy, making the AI's "free labor" useless if there's no demand. This is a significant disincentive for complete human displacement. Governments also rely on taxes from wages and consumption. Widespread unemployment would devastate tax revenues, impacting public services and potentially leading to social unrest. (I think this one is the most compelling reason why AI wouldn't replace us.)

Adding on to that by taking into account that corporations are greedy and their absolute focus is shareholder primacy, i.e., things will get worse before they get better, and before any regulations can be put in place.

*Your statement reflects a very real and legitimate concern. It is highly plausible that in the short to medium term, corporations will aggressively leverage advanced AI for profit, leading to significant job displacement and increased economic inequality, particularly before robust regulations and adequate social safety nets are firmly in place. This period of disruption could indeed "get worse before it gets better." The key question is how quickly societies and governments can adapt to mitigate these negative impacts and steer AI development towards a more equitable and beneficial future.*
youtube AI Governance 2025-07-04T04:4…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | company                    |
| Reasoning      | consequentialist           |
| Policy         | regulate                   |
| Emotion        | fear                       |
| Coded at       | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
  {"id":"ytc_Ugyj-rO4qpQBtexKxHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz5T5PxErYnrhtZEF14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy_F0sWsTz3G5HQFSF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwdTEyP5D-e6ylulbV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzW9E33bn3LU6D6UcR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxgFRzFDuMqRsWrXlN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxLId5ib7Nu59Vb0OJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyjbppvf8pSpnRI1VR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzr999ZXHaFk-PhCSN4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxKyJb5vFXJ2YmzT0F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
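A raw response like the one above can be turned into per-comment codes with a short sketch. This is only an illustration, not the tool's actual parser: the `parse_codes` function and `DIMENSIONS` tuple are hypothetical names, but the field names and the sample record (id `ytc_Ugyjbppvf8pSpnRI1VR4AaABAg`) come straight from the JSON shown.

```python
import json

# One record copied from the raw LLM response above.
raw = """[
  {"id": "ytc_Ugyjbppvf8pSpnRI1VR4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

# The four coded dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw_response: str) -> dict:
    """Map each comment id to its coded dimensions.

    Missing dimensions fall back to "unclear", the value the coder
    already uses when a comment gives no signal.
    """
    records = json.loads(raw_response)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = parse_codes(raw)
print(codes["ytc_Ugyjbppvf8pSpnRI1VR4AaABAg"]["policy"])  # regulate
```

Defaulting absent keys to "unclear" keeps downstream tallies well-defined even if the model omits a dimension from one object.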