Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@TheShowdown16 I don't think the comparison with weaving is accurate. Firstly, power looms don't compete with artist weavers, whereas AI generators DO compete with artists. Secondly, the power loom has huge societal value because it provides us with something we otherwise could not have gotten (because weaving by hand is tremendously tedious). AI art generators on the other hand have a negative societal value. This is because they do not solve any problem we have, yet they make art in general worthless. I do think your comparison is right in one respect though: the main problem being capitalism. The power loom was not inherently a bad thing. However, at the time the working class hated them. Why? Because the machines caused them to lose their jobs. For them, the power loom was a bad thing. I would say, it's not the power looms that are bad, but the economic system that left the workers to starve so that the factory owners could get more profits. If society were to strive towards human happiness instead of monetary gain, most of these technologies people hate or find dystopian would either not exist or not pose a threat anymore. Your last sentence seems to imply that you think there is just some natural phenomenon called "technological progress" that's completely detached from us humans, our societies and our economic systems. I can't disagree more with that. Technological progress does not exist in a vacuum. Our economic system and societal order have a huge influence on the direction of technological progress. Capitalism incentivizes profits, often contrary to the greater good. For instance, some kinds of medicine are just NOT developed further just because the pharmaceutical companies that fund it do not deem it profitable to do so.
youtube AI Responsibility 2023-01-04T21:2… ♥ 427
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgwcvAWGZjlGxFQkJ6F4AaABAg.9kTZa0b_bKV9kTe7DWwUN5", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwcvAWGZjlGxFQkJ6F4AaABAg.9kTZa0b_bKV9kTpbMWiPnV", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgwcvAWGZjlGxFQkJ6F4AaABAg.9kTZa0b_bKV9kU1KeZFF_z", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgzwVP68yYVb13H5l0R4AaABAg.9kTXn4v0vRB9kU4oPe7uza", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgyR9tXNAQuFyyzH_3h4AaABAg.9kTW6AdxRVa9kVauEXgvDI", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgyR9tXNAQuFyyzH_3h4AaABAg.9kTW6AdxRVa9kaEdIPA2J7", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgywLgZ6-QY_ymQGaot4AaABAg.9kTV4g7hHZD9kVcAqKs0wA", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgyyjPT3CoRcpZrjXL94AaABAg.9kTQizooCse9kU0eHeV6-I", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyyjPT3CoRcpZrjXL94AaABAg.9kTQizooCse9n5jruCzIql", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugy0EouRrFxrIzXVdoZ4AaABAg.9kTP_rxSE4j9kWBO2MpdXM", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
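A raw response like the one above is a JSON array of per-comment records, each carrying the same four coding dimensions shown in the result table. As a minimal sketch (not part of the original pipeline, with placeholder IDs instead of real record IDs), the array can be parsed and tallied per dimension like this:

```python
import json
from collections import Counter

# Placeholder records in the same shape as the raw LLM response above;
# the real "id" values are long YouTube reply identifiers.
raw = '''[
  {"id": "example_1", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "example_2", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Count how often each label appears within each coding dimension.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")
tallies = {
    dim: Counter(rec.get(dim, "missing") for rec in records)
    for dim in DIMENSIONS
}

for dim, counts in tallies.items():
    print(dim, dict(counts))
```

Using `rec.get(dim, "missing")` rather than direct indexing keeps the tally robust if the model omits a key in some record.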