Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ultimately it doesn't really matter if there is nobody left to buy anything, at least in the long run. What will actually happen is as follows, though:

1. The number of people who can afford to live comfortably will decrease, as people cram themselves into rental housing and so on in higher and higher densities just to survive.
2. The number of people who can buy products will decrease, but this will be offset by increases to productivity brought on by automation, with goods still being produced. It doesn't really matter if only 10% of people are buying your product in 30 years if that 10% of people have 95% of the wealth and you can charge appropriately.
3. Eventually those who own capital can simply shift away from selling goods on the wider market, selling only goods their fellow rich care about instead.
4. Eventually the rich can simply produce everything they need to survive and live a great life via automation and AI and so on, with maybe a small number of people still working for them.

In the short term, AI taking everyone's jobs isn't a problem for the people who own capital, because the people still around will still be working - they'll just be working for less and less money, in worse and worse conditions, and it will be quite some time before they can't manage to survive and aren't engaging in economic transactions. By that point the rich and those owning capital and AI and so on will have managed to figure out how to make things work for them.
Source: youtube · AI Harm Incident · 2024-08-27T15:3… · ♥ 1
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugwy_74cqrPGlLX442J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwbzdov16bPhCfTRwN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyvabzPWWTk8Uj7Mid4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyyf3yjo7Y9jj0gABl4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwMjCm2-ft-849AvsZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgysJYZINyDnmaYx1lp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwtDpXKHKDpbllVg594AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugxij0Xi3tTyvDbBDMB4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxM0Y53vZgPplk30SJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwbHnAhQXup0__UQKZ4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
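The raw response is a JSON array with one object per comment, each carrying an `id` plus the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and validated follows; the field names are taken from the response itself, but the allowed-value sets are assumptions inferred only from the values that appear here, and the real codebook may permit more categories:

```python
import json

# Allowed values per dimension. ASSUMPTION: inferred from the values seen in
# this one response; the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"resignation", "approval", "indifference", "fear", "mixed", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record for a comment
    id and a recognized value on each coding dimension."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# One record copied verbatim from the raw response above.
raw = ('[{"id":"ytc_UgxM0Y53vZgPplk30SJ4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings[0]["emotion"])  # fear
```

Validating against a closed value set catches the common failure mode where the model invents a category outside the codebook, so malformed codings fail loudly instead of silently entering the dataset.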