Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Two things. First, everyone says that AI is limited by finite information: AI can't generate new information, or at least it doesn't have access to private business information; it can only rearrange what is already public (for instance, public GitHub repositories). So if you learn a business through private information, I can't see how AI can replace that, and if you need such an AI to learn that information you would need quantum computing per human to replace that workforce by teaching it your business. It would still be cheaper to hire a human than to buy a quantum computer. Second, if Dr. Yampolskiy is right, it will indeed be a collapse. To keep the economy running you need people to spend money on things (food, clothes, housing, entertainment, etc.). If AI replaces us all, no one will have a job, which means no money; if no one has money, no one will buy anything, which means corporations will have no customers to sell to. I think there will be some kind of lobby to keep AI from going too far and keep it as an assistant instead of a replacement, so most people can have a job and money to spend. Some jobs need to disappear, though: anything that doesn't need critical thinking should be replaced by AI and machines in general (factory workers, fast food, waiters, bartenders, etc.).
youtube AI Governance 2025-09-07T22:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwKPAWWXeFBhzYK5iR4AaABAg", "responsibility": "unclear",   "reasoning": "unclear",          "policy": "unclear",       "emotion": "indifference"},
  {"id": "ytc_UgzaVKx-X0V5BOCx-gd4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugw0LSoRjZp4smnpFS94AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_UgwDeEM0RtCxkal29Ah4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgzAUD39IGWr7dumf254AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_Ugyur2HVjKWRlAa5hIJ4AaABAg", "responsibility": "developer", "reasoning": "deontological",    "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_Ugw6yq8Bff8bw8HY6r54AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgyCFOSjMKw2ZY93VgR4AaABAg", "responsibility": "developer", "reasoning": "virtue",           "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgyDG8chLgOu-Av04oV4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwtzPHOpX6Qrpf1o7h4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "liability",     "emotion": "fear"}
]
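Since the raw response is a JSON array of per-comment records, the coded dimensions for any one comment can be looked up by id. A minimal Python sketch, assuming the model output parses as standard JSON; the `codes_by_id` helper and the truncated sample array are illustrative, not part of the actual pipeline:

```python
import json

# Abbreviated sample of the raw LLM response: each record carries the
# comment id plus the four coded dimensions.
raw = """[
  {"id": "ytc_Ugw0LSoRjZp4smnpFS94AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzAUD39IGWr7dumf254AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_by_id(raw_json: str) -> dict:
    """Index the coded records by comment id, keeping only the four dimensions.

    Missing dimensions fall back to "unclear", mirroring the codebook's
    catch-all value (an assumption for this sketch).
    """
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in json.loads(raw_json)
    }

codes = codes_by_id(raw)
print(codes["ytc_Ugw0LSoRjZp4smnpFS94AaABAg"]["reasoning"])  # consequentialist
```

This matches the table above: the record for `ytc_Ugw0LSoRjZp4smnpFS94AaABAg` is the source of the displayed Coding Result.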