Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
These multibillionaires are pushing the AI revolution because they have unilaterally decided this is the future and that this future is inevitable. There has been no consultation with wider society. No discussion of benefits and drawbacks, of what society either needs or wants. I am all for mechanizing mindless factory production lines. I am against supplanting human intelligence with mechanical intelligence on the grounds that machine intelligence is superior. Human intelligence defines our species. To say it is not needed is to say we don't need humans. Of course, in reality, no one is planning to supplant CEOs, just ordinary working people. This is not a replay of the Industrial Revolution; this is a proposal for mass unemployment, no career opportunities for the young (unless they are the billionaires' children, in which case they will inherit the kingship of the few corporations left), no point in living, and a denigration of the human need for social contact with others. These developments must be subject to scrutiny and regulation. They need to benefit all people, not just the super rich. Many SF writers have postulated a future world where the élite few live in gated splendour while the rest live a feral existence, struggling to survive. I used to think this was fantasy.
youtube AI Jobs 2025-10-08T20:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgznWI6su-ufLzRll_p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwCe8DspozUcdDzXvx4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzTyYf0_LZFrcMJyVV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgzQIt2GqFsUf6ZCE1V4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxITYjJDUz0L9zHkbh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxs2l0Ob58h9nnkwNV4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwPPIH8Vh5P5_4gEQt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxpqQR-AlLK-aGTLRh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxx1_obyxqpAaiWt0B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyqXyc_T0bRj39U10N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
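The raw response above is a JSON array with one record per comment, keyed by comment `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and the record for a single comment looked up (the helper name `code_for` and the parsing logic are illustrative assumptions, not part of the tool itself; the field names and ids are taken from the response above):

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment coding records,
# using the same field names as the full response shown above.
raw = """[
  {"id": "ytc_UgxpqQR-AlLK-aGTLRh4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgznWI6su-ufLzRll_p4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]"""

def code_for(comment_id, response):
    """Return the coding record whose id matches comment_id, or None."""
    return next((r for r in json.loads(response) if r["id"] == comment_id), None)

record = code_for("ytc_UgxpqQR-AlLK-aGTLRh4AaABAg", raw)
print(record["emotion"])  # the dimension shown in the Coding Result table
```

Matching on the stable comment `id` rather than array position keeps the lookup robust if the model returns the batch in a different order than it was sent.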