Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This theory doesn't seem realistic to me in that proportion, even for operational jobs. The reason is simple: the production line and technological scalability ultimately aim to sell technology to users and companies that develop other products to be purchased by the population across various social levels. If AI breaks the economic cycle, it breaks the very reason for its existence. There is no way to concentrate capital among so few people without the resulting level of social chaos leading to political interventions. That is where the importance of state intervention lies—in not allowing this to happen. The risk here is that these interventions might follow the same model as free-market capitalism, where underdeveloped countries suffer the full brunt of this process while global economic hubs hoard all the wealth and point an atomic bomb at underdeveloped nations, creating a form of neo-slavery on a massive scale.
youtube AI Governance 2026-04-08T16:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxkR8kFk4g8AQKPvWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGiNK-UTcRcRWJBbB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwyCqQP9s8_RYnDmWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKiBUQdX39BMGPiYd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzKBSAbqr-SVR1g_oF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgylRDFlElEh3ZelIQx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzrnTO1DdGQ8RwNYu94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgypnXgcDN5HfavtrCN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy932JTkF5IbMs3DQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyghNaZ_KsnItaJNbt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
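The raw response is a JSON array with one object per coded comment, keyed by a `ytc_` comment ID with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how such a response could be checked against the coding result, the array can be parsed and indexed by ID; the `raw` string below copies two entries from the dump, and the parsing code itself is an illustration, not part of the actual pipeline:

```python
import json

# Two entries copied from the raw LLM response above; the schema
# (id plus four coding dimensions) is taken from the dump.
raw = """[
  {"id":"ytc_UgxkR8kFk4g8AQKPvWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGiNK-UTcRcRWJBbB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]"""

# Index the array by comment ID for direct lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment shown above.
first = codes["ytc_UgxkR8kFk4g8AQKPvWZ4AaABAg"]
print(first["reasoning"])  # consequentialist
```

Each displayed "Coding Result" table is simply one of these objects rendered as Dimension/Value rows.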