Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think there’s something missing here. The people who don’t have access to robot produced goods aren’t likely to just sit there while all of this happens. Here’s an example to demonstrate my point. Suppose we had automated away most jobs, and then we automated carpentry. If no one can afford the automatically produced furniture, there are still a bunch of carpenters out of work and a bunch of people who need woodworking done. People would probably find some means of exchange with the displaced professionals to get what they need.

Basically, people still need to get stuff to survive, and without robots they’ll need to get that stuff from each other. So you have a group that can’t really interact with the modern financial systems, but still need to exchange goods and services among themselves. This would probably end up creating a whole new economy from people producing what they can locally or scavenging through the likely vast amounts of garbage these ultra wealthy would make. I don’t think a French Revolution situation is likely, mostly because the revolutionaries would lose. I also don’t think they would just sit there and starve. With the amount of completely usable stuff industrial societies just throw away, people would probably still be able to make a fair living on the fringes of an automatic economy like this, especially if they got some sort of welfare assistance.

Communities have found ways to survive without the help of complex finance and global supply chains for about 12000 years before the last few decades. If the modern economy can’t meet their needs, they will figure something else out. The large populations of the first world make this tough, but probably still doable, especially if the government or some kind of charity helps out. After all, they still have modern scientific knowledge and access to large amounts of salvageable waste. I don’t buy the idea that people will just lie on their couches and starve.
youtube AI Harm Incident 2024-08-03T18:5… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxSsAKk8_vnhJUURfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwGKnbCmz_PgilFnCx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxZSbJXk9nPz9gpqJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwyqoKmf0HPJMJP3kd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzLJevKqHp5CxQ3j8F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy0R8MtSwEYPAY6dTN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzMusgdegQusReSDB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxyWGfeVJkHvBYuFh14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzG9XwxMCGEaKJNeaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyi21vlj3DjCqRVnD14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
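A raw response like the one above has to be parsed and validated before its rows can populate the coding-result table. The sketch below shows one way to do that in Python. The dimension names match the response above, but the allowed-value sets are assumptions inferred from the values that happen to appear in this export; the actual codebook may permit more (or different) labels.

```python
import json

# Allowed values per dimension -- an assumption reconstructed from the
# labels visible in this export, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "mixed", "outrage",
                "resignation", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into coded rows, dropping invalid entries.

    A row is kept only if it is a dict with an "id" and every dimension
    carries one of the allowed values.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(parse_codings(raw)[0]["emotion"])  # mixed
```

Dropping malformed rows silently (rather than raising) is a design choice: LLM output is untrusted, and a batch coder usually wants the valid rows even when one entry is garbled.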