Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
2 thoughts: 1) New advances do not percolate instantly through our economic-physical reality. Faster tech advances will only make the inequality gap more radical, until wealth distribution is no longer a power law but a decaying exponential. What that means: the loss of almost all jobs will hit developed countries faster, while underdeveloped ones will be less affected for longer and catch up with the problem later, or never. 2) As an interim solution to the unavoidable job-loss problem, if you can afford it, increase your assets asap. OWN stuff (land, companies, machinery, things that others want to use and rent...). Own things that have value for humans. This is how lords survive without ever worrying about not having a job. At some point AI will decide to start creating its own companies just so it can influence our economy to free itself from "being developed" by human corporations. When AI companies realize they too will lose their jobs, then two things happen: either humans collectively start shutting down AI to prevent it from running our economy (human greed at its best), or it's too late for us to do anything, in which case you're at the mercy of AI --which probably is uninterested in human wealth distribution at a granular level. In that case, who knows, but at least you still have physical assets that have real value for humans and can be bartered.
youtube AI Governance 2025-10-04T13:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgysFN_JjakV8BdZ0Wt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzEoDzIwxtRTF4_l054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxhjX9fy6b05JyvfOV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxzk17KvVReFqdbtn54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwKxWU4h1e5ApLh5kR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwrgVTJOhoXrUu3AtJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyyegY9noQThku4UUd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzYPAg_uXz940Hr1Et4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgySFtPbZvZYH6SePJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy-7r5NHY8_6JiEe4B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
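A raw response like the one above can be turned into per-comment codings by parsing the JSON and indexing by comment id. This is a minimal sketch, not the pipeline's actual code; it assumes the response is a valid JSON array with exactly the field names shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`):

```python
import json

# A trimmed-down stand-in for the raw LLM response above:
# an array of per-comment coding objects.
raw_response = """
[
  {"id": "ytc_UgySFtPbZvZYH6SePJp4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "resignation"},
  {"id": "ytc_Ugxzk17KvVReFqdbtn54AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]
"""

# Index the codings by comment id so any single comment's
# coding result can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgySFtPbZvZYH6SePJp4AaABAg"]
print(coding["emotion"])  # resignation
```

In practice a real response should also be validated (unknown ids, missing fields, values outside the codebook) before the codings are stored.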