Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think a scenario like that is impossible, because of economic reasons and social reasons. Here an example: If a general AI is developed that takes all jobs. Nobody will have money, so all companies will break down. Including the ones using AI. Actually I would think the companies using AI would break down first, since naturally people still need to eat and my believe is that in times of need people would naturally turn to other people before they turn to AI. If a world where people have no jobs can ever exist, the only way I see to do that would be, to make everything free and to install AI Systems to actually replace all jobs, including the maintenance of AI machines. This is not something that could happen in just a couple years. This would need world wide organizations, which do not exist. With the power to rule all over the world, in addition to the invention and implementation of a general AI and specialized Robots and other AI Interfaces to interact with the world in the first place. I do not think something like that would be socially accepted.
youtube · AI Governance · 2025-09-11T22:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-26T23:09:12.988011
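
The codebook behind these four dimensions is not shown on this page. Below is a minimal sketch of the record shape as a Python TypedDict; the example values are inferred only from the labels visible in this table and in the raw response further down, not from the tool's actual codebook.

    from typing import TypedDict

    # Hypothetical record shape for one coded comment. Dimension names
    # come from the table above; the listed values are only those that
    # appear on this page and may not be the complete codebook.
    class CodedComment(TypedDict):
        id: str              # YouTube comment id, e.g. "ytc_UgyGni..."
        responsibility: str  # "company", "ai_itself", "distributed", "none", "unclear"
        reasoning: str       # "consequentialist", "unclear"
        policy: str          # "regulate", "liability", "none"
        emotion: str         # "fear", "outrage", "indifference", "mixed", "unclear"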
Raw LLM Response
[{"id":"ytc_UgyGniMSMMEiDi3hRSJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxaaEzKkW4_sEqxiBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwcUNChQXgjKdtjOPd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzojmEL3iAfUZh-Gzp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxKanjqXRiYYQ2b_Th4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugw10xhCNmuDJ9Qnpot4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxG3Zt21xU3T1Lm--14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzznkOqgxf5Ube1VfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}, {"id":"ytc_UgwUbWuWtmTHS-kTbbB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugy2Dm3zoV0i5JMSTmJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"})