Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
at their core all those AI things are meant to be used in a corporate setting the text analyzing and summaries, the article writing, ... all based on their own environment/input to cater to your specific use case

however the marketing for the first publicly available and working generative AI (ChatGPT/DALL-E/openAI) was /so/ good that EVERYONE wanted in on it and thus starting the AI boom we have now*

AI in and on itself is neither good nor bad, a lot of systems already used a bunch of AI tools and they're saving lifes! in my opinion AI, especially generative AI wasn't supposed to be this unregulated and this easily accessible

The damage it's doing in this mostly uncontrolled space is insane - it's like letting a toddler run free in a giant toy and candy store

I don't think I can think of a single use case for text-to-image AI that couldn't also be covered by actual people with some money and that's where our capitalistic hellscape comes in A tool designed to save companies money being let loose on the general public and costing thousand's of creative people's jobs and livelyhoods

TLDR: keep the AI stuff in corporate settings, research and medical facilities, etc and maybe some home appliances (washer that determines how to wash, smart thermometer that learns your temperature preferences, etc just some QOL shit) but get it away from the general public TT_TT

*correct me if I'm wrong it's been a while since I read up on that

/rant over thanks for reading <3
youtube Viral AI Reaction 2024-10-21T16:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxKDI8lc15Iv4oRe9B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwa0cUW_j3sm4Ki5VV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwD9Xg8ePUDyUsM2kp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxE_wwzZ90qATtKHPx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwCFT5DB5Oz92KJwox4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0z6Fwt9Mp06_cAiR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz7HOcfzUf-7zZG9x54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzMq9tm4FJ38orOmPZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz01QlX78sTsGRd1k14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyL8Q_N6lF2aV6pHll4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
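The raw response is a JSON array with one object per coded comment. A minimal sketch (using only the standard-library `json` module; the variable names are illustrative, not part of the tool) of looking up the coded dimensions for a given comment id:

```python
import json

# Raw LLM response: a JSON array, one object per coded comment.
# Truncated here to two entries for illustration.
raw = (
    '[{"id":"ytc_Ugz7HOcfzUf-7zZG9x54AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgzMq9tm4FJ38orOmPZ4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]'
)

records = json.loads(raw)

# Index the codings by comment id so a single comment's result can be pulled out.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_Ugz7HOcfzUf-7zZG9x54AaABAg"]
print(coding["responsibility"])  # company
print(coding["emotion"])         # indifference
```

Keying on `id` matches how the table above pairs one comment with one row of dimension values.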