Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When fire was first discovered and harnessed by early humans, many among them must have feared that it would bring destruction or even kill us all. Some may think I am oversimplifying, but this reaction is remarkably similar to what I see among AI doomers today. Yes, fire has indeed caused immense harm throughout history, directly and indirectly, especially through conflicts and wars. Yet fire ultimately became a transformative force, elevating human civilization. The same applies, by analogy, to AI: the fear of extinction echoes ancient anxieties, while the transformative potential remains far greater than the risks when understood and governed wisely.

Here is my background. By age, Stuart Russell is 10 years younger than me. In my early 30s, I moved away from the Marxian notion of the "supremacy of human labour," arguing instead that the future of human society would be shaped by scientific and technological progress, leading to the obsolescence of human labour at some point in the future. The redundancy of human labour, I argued, would liberate humanity from all forms of labour exploitation, thereby eliminating many of the social evils we witness today. This transformation would open the path to a new egalitarian order, neither capitalist nor socialist, grounded in collective wisdom and the direct rule of the people.

I first articulated this vision in my 1981 publication, An Alternative to Marxian Scientific Socialism: THE THEORY OF REDUCTION IN WORKING HOURS. Today, on my "anticorruption fight" blog, I refer to it more succinctly as the "Zero Work Theory." Back in the 1980s, I never imagined that I would live to witness the emergence of such a future. Yet here we are today, standing at the threshold of that very possibility.
youtube AI Governance 2025-12-04T14:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          resignation
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwsdFYgY6L-PJ5oZGZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzA15LnbNPTqJRHMyh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugylnc62enH54VED_rp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyZDvhsmEvlSNd_9Ed4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwNHm2vRaPUHIzAu5R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxtWtYJdmsiuL7FFk94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyT5v0U-Ys2QQsu3s14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgywDQuzeYDLy4l7CgV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzsgWecWwb3BQsQql14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAz6DcRypQ-2IMKMd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
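A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the responses shown here (the actual codebook may define more), and the function names are illustrative, not part of any real pipeline.

```python
import json

# Allowed values per dimension, inferred from the raw response above.
# Assumption: the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_coded(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records outside the codebook."""
    records = json.loads(raw)
    for rec in records:
        # Comment ids in the data all carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}]'
records = validate_coded(raw)
print(len(records))  # 1
```

Validating eagerly like this surfaces malformed or out-of-schema model output at ingestion time, rather than letting unknown categories silently enter the coded dataset.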