Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This argument is not going to hold water with those desperate for AI to give them life saving technology. All our technology has vast costs to the Earth, but we have never yet shown much restraint, and I don't think that angle is going to mean much to most people. Calling it a "lotto" play is also a bit bold. You would have to be banking against a lot of very vested interests spending a lot of money if you think AI isn't going to have profound effects. I agree it is a lotto play, but mostly because it could make things (not just the environment, though I care about that too) much worse. But comparing it to things with "direct benefits today" implies it is a weak bet that it will do much, which seems unlikely. I also think most people (when not writing political slogans, maybe) are fine with the idea that we invest in many things for many time periods, so that won't mean much to most. The threats of AI need to be taken very seriously, not dismissed. Hopefully people will have a better sense of weighing up their possible cures with the potential pitfalls as we get more discussion about AI across society.
youtube AI Moral Status 2025-07-24T15:3…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgyW6wdZuKeFDWBNaC14AaABAg.AKw1Bh465MLALUCX5TB5sg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyyRmHU9xqUCZYwZB54AaABAg.AKvtoPwlPoCAKvtupetDsC","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgwGBvisktrwYHFlu6l4AaABAg.AKvoBtI_muxAKvqbRygS5L","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgyDehvHl3nnH0DJM_h4AaABAg.AKvkzEqGHhdAKwsRDRDd2-","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugw2GKIxUk892yaABSZ4AaABAg.AKvkrAaHHsPAKwGrYoSLIm","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_Ugxmdp33praTZigNdTR4AaABAg.AKvkqi6FyV8AKvqybxW2DX","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzzcioOvqVDbHz6FR14AaABAg.AKvkXWb7bFUAKwJlKhp5CG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzzcioOvqVDbHz6FR14AaABAg.AKvkXWb7bFUAKyE1qas8AT","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzzcioOvqVDbHz6FR14AaABAg.AKvkXWb7bFUAKyFP13tOZm","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzzcioOvqVDbHz6FR14AaABAg.AKvkXWb7bFUAKyHQU5wJHT","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
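The raw response is a JSON array where each object carries a comment id plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and the coding for one comment looked up by id; the `parse_codings` helper and the shortened id are hypothetical, and the field names simply mirror the raw output above:

```python
import json

def parse_codings(raw: str) -> dict:
    """Map each comment id to its coded dimensions.

    Assumes the LLM response is a well-formed JSON array of objects,
    each with an "id" key and one key per coding dimension.
    """
    entries = json.loads(raw)
    return {e["id"]: {k: v for k, v in e.items() if k != "id"} for e in entries}

# Shortened, made-up id for illustration; real ids look like
# "ytr_Ugw2GKIxUk892yaABSZ4AaABAg...".
raw = (
    '[{"id":"ytr_example","responsibility":"distributed",'
    '"reasoning":"mixed","policy":"unclear","emotion":"resignation"}]'
)
codings = parse_codings(raw)
print(codings["ytr_example"]["emotion"])  # resignation
```

In practice a parser like this would also want to validate each dimension against its allowed value set (e.g. rejecting an emotion outside the coding scheme) before storing the result.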