Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Once AI and robotics truly become good enough to begin replacing humans in the economy en masse, two things will occur:

1) The governments of the world's advanced nations will have no choice but to begin creating payment schemes based on IQ. Fundamentally, for example, if any person with IQ below 100 can be replaced with a robot in any job which they can conceivably be trained to do, then this person is de facto disabled from birth. But this person is still a voter. So... These schemes will, perforce, rapidly take on the character of a UBI, especially if the prowess of AI keeps increasing until the majority of people around the world are unemployable.

2) The cost of goods and services will simultaneously drop through the floor. Robots don't get sick, don't have bad/sad/lazy days, don't take vacations, don't have children, can work 24/7/365, etc, etc, etc. So, suddenly, the cost of manufacturing a good begins to asymptotically approach the cost of raw materials plus energy. This means that a UBI will actually become affordable. Basic food, housing and clothing will suddenly be so cheap, that it will not be a problem to provide a comfortable baseline living standard to literally every citizen, regardless of their social position.
youtube 2022-11-03T01:1… ♥ 4
Coding Result
Dimension        Value
Responsibility   government
Reasoning        consequentialist
Policy           regulate
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugysq8achhA7d4rpbgZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwmumZNpM1vMe7bqDh4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwaX1x_PbHR-tMp-G94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzDCKfWcgEPaODgUDR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxD9A2ojchZCNW8hRl4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgyIeD3YHEtfYox6s9h4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx2gJThxeIoO6-pBcB4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwLyhi-NaHqbz0lVb54AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyKeOaUPKhnvbvGCUZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxe3QPLMvx3yVFEqf54AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
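Since the raw response is a plain JSON array of per-comment codings, looking up the coded dimensions for a given comment is a one-step parse-and-index. The sketch below is a minimal, hypothetical illustration (the actual pipeline's parsing code is not shown here); it uses only two entries from the array above and assumes the response is valid JSON with one object per comment `id`.

```python
import json

# Hypothetical excerpt of a raw LLM response: a JSON array of codings,
# one object per comment id, with four coded dimensions each.
raw = """[
  {"id": "ytc_Ugysq8achhA7d4rpbgZ4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxe3QPLMvx3yVFEqf54AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]"""

# Index the array by comment id so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions (responsibility, reasoning, policy, emotion)
    for one comment id."""
    return codings[comment_id]

print(lookup("ytc_Ugxe3QPLMvx3yVFEqf54AaABAg")["policy"])  # → regulate
```

Indexing by `id` rather than scanning the list each time keeps per-comment inspection O(1), which matters when a single response codes many comments in one batch.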