Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI for reference: You picked the worst examples. There is also AI art where you can't tell the difference. And what if I need a reference in a specific style for something that doesn't exist in real life, like magic?

Lower cost of entry: "Lower cost of entry" doesn't mean "the cost to produce an image by myself". It means a lower cost of GETTING an image for use in other things. Take an indie game developer who needs some background art for a loading screen: without AI, it would have cost him hundreds of dollars through Fiverr. "Cost" also means time. I myself want to create a game in an anime style. Getting to the skill level I would need would cost years, and after that I would need years more for the creation itself (because I want to animate every angle). If I hired artists, the art alone would cost me tens of thousands of dollars.

AI is inevitable: I think you're mixing a few things up here. Whether AI companies fulfill their promises now, or are currently profitable, gives no indication of the future. Yes, investors are basically being tricked, but that's not our problem. Computing power will become exponentially cheaper, and AI models will become smarter and more efficient. In the future you won't create 50 images and hope that at some point the one you want is rolled by chance; you will create 1-5 draft images and then adjust individual points in them until it fits, without regenerating the entire image. You also won't have to generate endlessly for a flat $20/mo, but will pay according to usage, e.g. $0.50 per image, but then that image is the way you want it. So AI is inevitable. I agree on the other points.

----------------------

Why is it always everything or nothing (from both sides)? Why couldn't we use it as an addition? It doesn't replace learning or practicing.
Source: youtube · Viral AI Reaction · 2025-04-11T10:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
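A coded row like the one above can be sanity-checked against the category values that actually occur in this batch. The value sets below are only those observed in the raw response on this page; they are not necessarily the tool's full codebook, and `is_known` is a hypothetical helper, not part of the coding pipeline.

```python
# Value sets observed in this batch of codes (assumption: not the full codebook).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "user", "company"},
    "reasoning": {"unclear", "deontological", "mixed", "contractualist",
                  "consequentialist", "virtue"},
    "policy": {"unclear", "ban", "none", "industry_self", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "fear",
                "resignation"},
}

def is_known(code: dict) -> bool:
    """True if every dimension holds a value seen elsewhere in this batch."""
    return all(code.get(dim) in vals for dim, vals in OBSERVED.items())

# The row coded for this comment, as shown in the table above.
row = {"responsibility": "none", "reasoning": "consequentialist",
       "policy": "industry_self", "emotion": "approval"}
print(is_known(row))  # → True
```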
Raw LLM Response
[
  {"id":"ytc_UgwB3lb4ZsuePuMXLV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyZeZ4xg4oy26arzjN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxJmtQunE4ozC-CbWx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyy8J6LfJBwAcHr2rt4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy4ohxVvwEzYGmqxQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxvXT6p2kRjgTE_l0B4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyvdxYgkYv7ozTtIwp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx0Ayf94v1SCyGHYFB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSUbxCepN6BBDDUvR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxczRpGATlVUFm4EbB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}
]