Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This really isn’t going to work long term. The whole reason AI datasets are so large is to fix any weird holes anyway. A lot of AI training data is already poisoned by SEO keywords and algorithmic BS. Looking at the Nightshade article, they had to train a Lora on 300 poisoned images to really make a massive difference - but that’s a dataset of ONLY poisoned images. If there’s 10,000 poisoned images in a dataset of 50 million, it’s a statistical anomaly. Artists would better spend their time working with companies like Invoke AI who want to help safeguard artist rights, and find ways to make say opt-out forms and such to have their data excluded. Some AI companies already allow artists to opt out - Stable Labs removed Greg whatshisname from their dataset at his request. Yes, you’ll never be able to prevent unscrupulous people from training Lora sets or something. But software companies have likewise had to learn that DRM never fully prevents unscrupulous people from pirating software either.
Source: YouTube · Viral AI Reaction · 2024-11-04T14:1…
Coding Result
Dimension       Value
--------------  ----------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation

Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyMd8RbSEnUXJhChFh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxbEBMYtTR1dnzrJJd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxLwMI2gLsQyF5FqjR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwkAOtFoAVOVNvzLd14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
  {"id":"ytc_Ugz4_mwpA8tmAylj-Ax4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx3NmgZO6I6xwAfJi14AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyu-l4SGUvXOyQjZIN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgziofpHglQxBhnrOR14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw28z27FcR-uTmP2x94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxGcaMW7bB-1bDbhSx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
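The raw response above is a JSON array in which each object codes one comment along the four dimensions shown in the Coding Result. A minimal sketch of how such a response might be parsed and validated follows; the allowed values are inferred only from the responses visible here, so the actual codebook may contain additional categories, and the `parse_codings` helper and `SCHEMA` mapping are illustrative names, not part of the original pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (an assumption; the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"government", "company", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "disapproval", "fear", "mixed", "resignation",
                "approval", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows whose values
    all fall inside the expected schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
print(parse_codings(raw)[0]["emotion"])  # resignation
```

Validating against a closed value set like this catches the common failure mode of an LLM coder inventing out-of-schema labels, which would otherwise silently contaminate downstream tallies.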