Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
11:22 Economist Ravi Batra pointed out that as productivity rose while average income did not, the difference ends up being made up through other means. If a UBI is used to close the gap between production and consumption, it will have to be increased over time: $1,000 a month is not very much right now, but in a future where employment is very close to zero it would take quite a bit more to keep the economy moving. There are many writings on the net, and certain books, that discuss this; it is a large part of why, since the Great Depression and the post-war era, the role of government spending in the total economy has risen so much as a percentage of GDP, much of which traces back to the productivity improvements made since the Depression. In a totally automated economy, where prices theoretically fall to near zero and unemployment is near total, the UBI would have to be increased to close the gap between total production and the amount of income people would need to buy what is being produced. To a great extent, then, the context of a UBI experiment in a pre-automation society is going to be rather distorted, because the reality of the present is very different from the post-labor economics we might eventually see.
youtube AI Harm Incident 2024-09-09T19:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          regulate
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxOFHmDWWzVNCII-Op4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxa6y1gDK2Fd8SVWfF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzVmMN70AypRNwR-1B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyYjGseLdTZyVYBLIV4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwM6BJUsO4KY0-9ODp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyAgOoyREhwH0OR0Pd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgyeUVPBMUBXS61hARl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyqLOEe3wMT7d4ApsZ4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw_eHF0dkdMcNdXWwJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugy7glpcNK5MySJRowN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
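A raw response like the one above is only usable downstream if every record carries an id and in-schema values for each coding dimension. The following is a minimal sketch of how such output might be parsed and validated; the allowed-value sets are inferred from the values visible on this page, not from the project's actual codebook, and `validate_codings` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the values
# appearing in this page's output; the real codebook may define more.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"unclear", "none", "ban", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when it has an "id" and every dimension's value
    falls inside the allowed set for that dimension.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example: one in-schema record parses cleanly.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # → 1
```

Dropping (rather than repairing) out-of-schema records keeps the coded dataset conservative: a hallucinated category value is treated as a failed coding rather than silently mapped to the nearest legal label.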