Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Poisoning doesn't really work against training loras. All poisoning does is to confuse automatic labeling systems, and it's fairly easy to circumvent. Any user motivated in using your art for training will do so, even if all your art was poisoned, because they can just distort the images until image recognition works again or they can write the descriptions themselves when everything else fails.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-08-21T10:3…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
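
The table collapses the model's structured output into one record per comment. As a data structure, here is a minimal sketch of that record in Python, with field names taken from the raw response below; the allowed-value sets list only the labels observed in this batch, not necessarily the full codebook, and the class name is illustrative:

```python
from dataclasses import dataclass

# Labels observed in this batch; the real codebook may define more.
RESPONSIBILITY = {"user", "company", "ai_itself", "none"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed", "unclear"}
POLICY = {"none", "liability", "ban"}
EMOTION = {"indifference", "outrage", "approval", "resignation", "mixed"}

@dataclass
class CodingResult:
    """One coded comment, as emitted by the model."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # Reject records whose labels fall outside the observed vocabularies.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```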
Raw LLM Response
[{"id":"ytc_UgwH68rj21yn-3yzpsB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwEyHSxEJxSG9co81x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxLKFfFyatTfR77INN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgyVbJmQatOMyeOTgQV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugyn-zWmYeNWIz53UVB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_Ugx9daiVEP27NegeU8R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgykT-zlXaIurNu2npJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwKkbPJilGlDO6PB6h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxU6D1z-DjU-rfmQ7x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugz19n5GXXCk6byXToh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}]