Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here is what I think is the translation could imply When Ai gets better at every job we don’t need humans and as you guys know from experiences there is no such thing as a free lunch you are not going to get paid for nothing either it will come at a cost which it will I have very little doubt it won’t but here is what else I think might happen they will try to get rid of us bc earlier humans was needed to do things for them now they don’t need that we are a nuisance to them we need sleep food entertainment sex hobbies we get angry we need poop breaks lunch breaks we need holiday rest etc but the AI it doesn’t need many of these things and when we are on ubi it’s like they are paying for people who are useless yes there is an author who calls us useless his name isYuval Harari the WEF Guy called us useless anyway the point is why would they pay us free money at least not all of us hence they will try to get rid of us it can be via the food supply via reduction in population through things like feminism homosexuality etc pandemics and vaccines etc they will at least try to get rid of the pesky ones if not all
youtube AI Governance 2025-11-08T17:2…
Coding Result
Dimension      | Value
Responsibility | company
Reasoning      | consequentialist
Policy         | liability
Emotion        | fear
Coded at       | 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxbxmSFh3LfHyrZhn94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxlGD5Qc27d3NIfpBh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzJ6rL5s012LopURHN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyKUKYvVDn4Sh4Evp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxZwR_Tn2giQiF3THN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxmBz8i-K5OqsnowVd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzvDND9-XuWWyVFazp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeXZp6oJ8KMOJKqLx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxgidO6decsGn4k1YB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxAqaFEqOxQ-qMJnd14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
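A raw response like the one above can be parsed and checked against the coding scheme before the labels are used downstream. A minimal sketch, assuming the dimension names come from the response itself and the allowed label sets are inferred from the values observed here (the actual codebook may include more categories):

```python
import json

# Allowed labels per dimension -- assumed from the values observed in the
# response above; the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the raw LLM response and keep only records whose labels
    fall inside the expected category sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

sample = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]'
print(len(parse_codings(sample)))  # 1
```

Records with out-of-schema labels (e.g. a hallucinated category) are dropped rather than corrected, so the count of parsed records against the count of input comments flags how often the model strayed from the codebook.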