Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@crowe6961 i guess, but its not just artists who know less most tech enthusiasts dont know how art works and what it means its a problem for both sides and no one is willing to sacrifice something for the greater good, i have touched on neural networks not llms but just basics and i am intrigued by the use case for them and i also like art so im the few whos sitting on the lines (although im more leaning towards supporting art than ai) and imo, if poisoning ai is almost illegal then taking data without permission should also be illegal (like cookies i guess), it also sounds more like a problem for the trainer, they are the ones taking the poisoned art for their ai without permission (going on your example, if the trainer took bulk amounts of poisoned ai). the law hasnt made any rules as of yet for ai other than the no copyright one so we cant say for sure i personally would want a default opt out but can opt in law for artists where they are already opted out but can opt in if they wanna to train art, and a law that protects ai from attacks like sabotage, poisoning but imo ai feels like a less priority to me (going off of how we never needed ai but we had art since the beginning, but ig companies need protection.
youtube Viral AI Reaction 2024-09-25T01:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgyP9ddZLzh7C5RQqph4AaABAg.A8ewOXJeSVyA8kI6PJQZRQ", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyP9ddZLzh7C5RQqph4AaABAg.A8ewOXJeSVyA8mGsfTEs9b", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyYy3jRiPx2RIA-pH14AaABAg.A8ep9sAEw85A8hZhe-CTiq", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugye_VaVqVZRInePjtN4AaABAg.A8eluAqjb69A8mJ9og23Ik", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytr_Ugye_VaVqVZRInePjtN4AaABAg.A8eluAqjb69A8n4wl5LWxp", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugye_VaVqVZRInePjtN4AaABAg.A8eluAqjb69A8pfaeaTCrj", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgyUBkCwYRdcbBCxJgR4AaABAg.A8ekLljUrlGA8eoml6zeJI", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxXOG1euFlGg9WEwMN4AaABAg.A8edk_roKGcA8iQufNIxZ9", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxXOG1euFlGg9WEwMN4AaABAg.A8edk_roKGcA8jDbMB4eD-", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyM97HtmuLu8EI1sip4AaABAg.A8ecuEgHIOLA8i2xGFDquk", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
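The raw response above is a JSON array with one coded record per comment. A minimal Python sketch of parsing such a response and looking up a comment's coding by id — the field names (id, responsibility, reasoning, policy, emotion) follow the JSON shown above, but the short ids here are hypothetical placeholders, not real comment ids:

```python
import json

# Hypothetical raw response in the same shape as the array above.
raw_response = '''
[
  {"id": "ytr_example1", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_example2", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
'''

# Parse the array and index each coded record by its comment id.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up the coding assigned to one comment.
print(by_id["ytr_example1"]["emotion"])  # resignation
print(len(records))                      # 2
```

Indexing by id makes it easy to cross-check a record in the raw response against the summarized Coding Result shown above for the same comment.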