Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I frequent an indie AI training site. Base models typically require so much data to train that it's really hard for independent trainers to create with consent. However, if a base model is released open source, it lets people create smaller lora addons like the one seen in this video. This logistically is a lot more reasonable for consensual training and actually allows for the creation of resources that promote the people they were intended to mimic. There are a few AI trainers I've seen train on an artist's work as a fan project with their approval. Granted, among the AI trainers I've seen who don't know the artist personally, all of them were only granted permission after the fact. No artist wants to be the one to greenlight an AI project, so the only fan AI trainers at present who are able to exist are the ones who don't ask. So there is a reasonably ethical way to train AIs. It's just suffering from a lot of inherited baggage. PS: Also, some AI users do become artists. That is a thing I can confirm happens. I once even saw an AI trainer quit to pursue art, and his community he built was pretty supportive.
youtube Viral AI Reaction 2026-01-01T02:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       mixed
Policy          industry_self
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwjRMasnLas5ps5lpd4AaABAg.ARXF5X-03EkARhthwFv67R", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzHIhHRo74uMAWQzdB4AaABAg.ARSOVz6yNQjARWmnsoYRZo", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgzHIhHRo74uMAWQzdB4AaABAg.ARSOVz6yNQjARjYYHC_lIc", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgzHIhHRo74uMAWQzdB4AaABAg.ARSOVz6yNQjARjarYDNhN7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzHIhHRo74uMAWQzdB4AaABAg.ARSOVz6yNQjARjouuX9bbi", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxzLiJ5DqCkdi_lMf94AaABAg.ARQVwU2h-80ARRmmJ9ie1G", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzIAxhY94wahXr1f9h4AaABAg.ARPiAnjDCENARRnQWEMdiP", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytr_UgxN_vP-ZSrEDqvAHwV4AaABAg.ARMapN59YjcARMo-cV8vzV", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgxN_vP-ZSrEDqvAHwV4AaABAg.ARMapN59YjcARPSpxdRN5r", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugxm5chAF7Or4Xyp4yx4AaABAg.ARMPuTXzH6hARQLa7H5iUz", "responsibility": "developer", "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"}
]
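A minimal sketch of how a raw response like the one above can be parsed and indexed by comment id, so any single coding can be inspected. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON above; the sample ids (`ytr_aaa`, `ytr_bbb`) and the `parse_codings` helper are hypothetical, not part of the actual pipeline.

```python
import json

# Hypothetical abbreviated sample in the same shape as the raw LLM
# response above: a JSON array of coding records, one per comment id.
raw = """
[
  {"id": "ytr_aaa", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytr_bbb", "responsibility": "developer", "reasoning": "mixed",
   "policy": "industry_self", "emotion": "approval"}
]
"""

# The four coding dimensions plus the comment id.
REQUIRED_FIELDS = ("id", "responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw response and index records by comment id,
    skipping any record missing a coding dimension."""
    coded = {}
    for rec in json.loads(text):
        if all(field in rec for field in REQUIRED_FIELDS):
            coded[rec["id"]] = {f: rec[f] for f in REQUIRED_FIELDS if f != "id"}
    return coded

coded = parse_codings(raw)
print(coded["ytr_bbb"]["responsibility"])  # developer
```

Indexing by id makes it easy to line up a displayed coding result (like the table above) against the exact record the model emitted for that comment.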