Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think it's... tricky. To say "it's far away from now" does not bode well because technology's development is moving exponentially so what isn't possible today could be very possible in 20, 10, 5, or next year. You can't control that, and once you've opened that pandora's box, there's no going back. I'm a millenial and I have still met classical graphic designers who were trained on drawing all signage by hand, they're not of retirement age and many have lost their jobs to "AI" tools already such as text box tools. Sure, they didn't learn the new tools, but nobody offered to train them, either, they just walked into work and received a pink slip and suddenly the industry has no need for their skills. So, while I am curious about the utility of these tools, I lean towards being fearful of them, because what seems like a long way away from you might be a hell of a lot closer than you realized. AI replacement, or even just jobs exclusively for the people who modify the AI algorithms, could be coming before you turn 30. And *then* what will you do for the rest of your life? Contrary to popular belief on the internet, you're not dead at 30, you're not even midlife. Can you start over so soon into your career? Can you abandon painting in colors for coding? I think the risk-reward is way out of balance, the risk is way deeper and more impactful than the reward. tl;dr, I'm not concerned about the definition and platform of AI art, art always has infinite room to exist, I'm afraid all of the centralizing corporate money will go there and thousands more artists will go unemployed as a workforce. If we had a UBI system or weren't under capitalism, I would not be concerned at all.
Source: YouTube · Viral AI Reaction · 2022-07-18T23:1… · ♥ 65
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | consequentialist
Policy         | regulate
Emotion        | fear
Coded at       | 2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_Ugz8UXGrPFRjRumrqIN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwCFCpuGhcTtUTK5Od4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugwu65nwLHYoZEuFvuF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgyZJf0_6_ECU9PYwg54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzXG27VrEaW4rjUPJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzmXXQhnxknPHwrrLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgznaTgkn8k39AEF3dd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
 {"id":"ytc_UgyKzsjNrhANvzyYdjt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugzevmngpy1VbfPao1p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz0olFWMu2RHeJTks14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"}]
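The raw response above is a JSON array with one code assignment per comment id. A minimal sketch of parsing it and looking up one comment's codes (the id and field names come from the response above; the truncated array here is illustrative) could look like:

```python
import json

# Raw LLM response: a JSON array of code assignments, one object per comment.
# Only one entry from the full response is reproduced here for brevity.
raw = '''[{"id": "ytc_UgzXG27VrEaW4rjUPJN4AaABAg",
           "responsibility": "none",
           "reasoning": "consequentialist",
           "policy": "regulate",
           "emotion": "fear"}]'''

# Index the assignments by comment id for O(1) lookup.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Print the codes assigned to a single comment.
entry = codes["ytc_UgzXG27VrEaW4rjUPJN4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {entry[dimension]}")
```

Indexing by id also makes it easy to cross-check an entry against the coding-result table shown for that comment.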