Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If they would just license the images for this use case it wouldn't be so immoral. The industry using it instead of people, well, that's just greed and not really the AI itself being the issue. Normal people can't do that kind of scraping, so for the at-home enthusiast, we end up more or less just generating images to be used to train. Basically you reject most of the images but check off on the best, and it will essentially steer the training model in the right direction toward whatever style you want. The original few big ones that started this whole thing, they are the ones that screwed you all over. You can buy training data that is very well sorted and such, it's just expensive. So IMHO it's more of a moral issue of the original training data set used by the big companies; none of the regular people CAN do what they do. Let's just say I love playing around with AI myself. I use Linux exclusively and I have all the software installed and set up, and I run a server to store all the massive amounts of data, on the level of multiple terabytes. Even I 100% disagree with the way the industry is going right now. I am also a bit of a writer personally; I have been published in magazine articles and such, but it's not something I am comfortable trying to do as a sole profession. Unfortunately, what you are doing is essentially just a drop in the ocean, but it's a conversation we all need to have, and sooner rather than later.
YouTube · Viral AI Reaction · 2024-10-27T10:0…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy_BaRzWHwE9pTp91N4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyywv1haPSrFFLFxcl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "sadness"},
  {"id": "ytc_UgxwhibCRvjjpzAhw0R4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxyRov97LR2Bs7xvqN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "sadness"},
  {"id": "ytc_UgxJiGQQAB8R59ro2kF4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzP5I87ROaEox8zpHl4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzwpDXPMRltEhzwwSZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx8pdPEMIM-p_OohHl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzlXd6nEni6KezqEWF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwJxEDCFMV7H27SAG94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]
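For anyone inspecting these raw responses programmatically, here is a minimal sketch of parsing the batch JSON and pulling out one comment's codes. The helper name and the allowed-label sets are illustrative, inferred only from the values visible in this response; the actual codebook may define more categories.

```python
import json

# Raw LLM response: a JSON array of per-comment coding records
# (abbreviated here to two of the records shown above).
raw_response = '''
[
 {"id": "ytc_Ugy_BaRzWHwE9pTp91N4AaABAg", "responsibility": "none",
  "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugyywv1haPSrFFLFxcl4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "sadness"}
]
'''

# Illustrative label sets, inferred from this response only;
# the real coding scheme may include additional labels.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "sadness", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def codes_for(comment_id, raw):
    """Return the coding record for one comment, validating its labels."""
    for record in json.loads(raw):
        if record["id"] == comment_id:
            for dim, allowed in ALLOWED.items():
                if record.get(dim) not in allowed:
                    raise ValueError(f"unexpected {dim!r} label: {record.get(dim)!r}")
            return record
    raise KeyError(comment_id)

result = codes_for("ytc_Ugyywv1haPSrFFLFxcl4AaABAg", raw_response)
print(result["responsibility"], result["policy"])  # company regulate
```

Checking each record's labels against a fixed set like this is a cheap way to catch a model drifting outside the codebook before the values reach the results table.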