Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
To clarify my point, it's not the technology that matters. What matters is that under Capitalism, humans are expendable resources, therefore Owners will replace humans with automated systems whenever possible. As pointed out in the video, certain software can already compete very well with humans, therefore, someone will say "Why do I need an expensive human, when a cheaper automated system can do the job?" Look at your grocery store - how many humans have been replaced by some flavor of either self-service machine or automatic tally systems? Sony and Disney execs are already talking about replacing thousands of humans with computerized systems for writing, animating and voicing. Owners will displace humans. As far as technology goes, remember that only 24 months ago (2022), "AI" (by any other name) wasn't taken seriously outside a few nerd groups. Now *everyone* can benefit from it. The tech curve may not match Moore's Law, but that's simply hair-splitting. It's evolving geometrically fast enough. Even if we say it "only" doubles every 36 months, in 10 years it will be 8x more powerful. Eight times a billion is Eight billion, which is seven billion more than "AI" needed to figure out how to draw fingers correctly for some hack typing strings into a keyboard. (I could as easily say "8 times a potato is 7 potatoes more than necessary, so don't get hung up on the numbers.") But, to repeat, the main social disruptive factor is not technical, it's usage, and the flaws of our economic models.
youtube AI Harm Incident 2024-05-31T22:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_Ugweg0Hlk5amWE7IHcR4AaABAg.A46Ft6UuMNkA48L-KoLDZf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxNhQfo1NtiDl-LhGR4AaABAg.A46FrNP7yFYA46MPF7yJ-Y","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxNhQfo1NtiDl-LhGR4AaABAg.A46FrNP7yFYA46NZrns7vx","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxNhQfo1NtiDl-LhGR4AaABAg.A46FrNP7yFYA476uIZla3c","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxNhQfo1NtiDl-LhGR4AaABAg.A46FrNP7yFYA47DGkiubw3","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyarB4AxK6J1m1hQvN4AaABAg.A46FkjxaGOUA46VkdD0Ugs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyarB4AxK6J1m1hQvN4AaABAg.A46FkjxaGOUA46oYAre9-k","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgyarB4AxK6J1m1hQvN4AaABAg.A46FkjxaGOUA4AL8lPjb8k","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyrrEagJSdnFd5Jr1t4AaABAg.A46FAH2qi1VA46IJDQdV6h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxVCVjn43ikz6SiH7x4AaABAg.A46F-EOTfXHA46W8KvEi_W","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
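The raw response is a JSON array of per-comment codings, and the coding-result table above is simply the record whose id matches the displayed comment. A minimal sketch of that lookup (assuming the response parses as valid JSON; the array below is truncated to two of the ten records, and the ids are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (truncated here).
raw = '''[
 {"id":"ytr_UgxNhQfo1NtiDl-LhGR4AaABAg.A46FrNP7yFYA476uIZla3c",
  "responsibility":"company","reasoning":"consequentialist",
  "policy":"regulate","emotion":"fear"},
 {"id":"ytr_Ugweg0Hlk5amWE7IHcR4AaABAg.A46Ft6UuMNkA48L-KoLDZf",
  "responsibility":"none","reasoning":"unclear",
  "policy":"none","emotion":"approval"}
]'''

# Index the codings by comment id for O(1) lookup.
codings = {rec["id"]: rec for rec in json.loads(raw)}

# The coding shown in the table above belongs to this comment id.
rec = codings["ytr_UgxNhQfo1NtiDl-LhGR4AaABAg.A46FrNP7yFYA476uIZla3c"]
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dim}: {rec[dim]}")
```

Running this prints the four dimension values that populate the table (company, consequentialist, regulate, fear).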