Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I do think we wouldn't be so mad about Ai "Art" if Ai was made like a buddy for us humans instand of a weapon for us to abuse other humans. Imagine if instand of trying to substitute ourselves, we tried to teach Ai how to be human? Like, the Ai corporation doesn't steal your art, but asks for you to teach it, so you consent on showing how you do something and correct it, then it shows you what it made, you help correct it and then finally, your Ai buddy learned to make something based on you. And then the corporation uses it, but still has to legally pay you for the work the Ai does. It would make Ai not a Thief, but an assistant we train willinly, consentualy and then was used for the corporation's needs, but also supported the creator. At the same time, it could be used so artists don't have to be overworked, but have an Ai assistant. Notice how i say assistant, not model or anything like that, because an assistant is everything but what Ai is made into, at least in every art form i see. I, and many more, would be much more positive about Ai if it just wasn't made in an effort to substitute artists, if we were given the oportunity of skiping the worst parts, but still being able to put our souls into our work. Now that i look back, i don't think an Ai could really make something for us, but many would like if it could do something inspired by us and still give us the money and credit. Wouldn't be the same, but at least would make it easier and give more support... I don't know if what i am trying to say makes sense...
Source: YouTube — "Viral AI Reaction" — 2025-08-17T14:1…
Coding Result
Dimension      | Value
Responsibility | unclear
Reasoning      | unclear
Policy         | unclear
Emotion        | unclear
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgwmIVK-ysr9Fe8RyTd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxhDiZm7dGqY9zs8OB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugws1rnD4BltPr_fS4F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugxfu3LlIRqIhR6G0El4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_UgxkJhXn_zan5jhfv0h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugzt3CEDtlOAIXP0TMV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxTINB6O00FJ8Ot0_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz50y6iiyEmuKzdbYp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwY2lnSoJ7QNVeErmB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwYt0zjYas5ZVIEH054AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
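A raw response like the one above has to be parsed and validated before the per-comment codes can be stored; when a label falls outside the codebook, the dimension shows up as "unclear", as in the table above. The sketch below shows one way to do this in Python. The allowed value sets are inferred only from the labels visible in this response and are an assumption; the real codebook may differ.

```python
import json

# Allowed labels per coding dimension, inferred from this response
# (assumption -- the actual codebook may define other categories).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "approval", "mixed", "indifference", "unclear"},
}

def parse_codes(raw: str):
    """Parse a raw LLM response into (valid, invalid) coded records.

    Each element is (comment_id, record, bad_dimensions); a record lands
    in `invalid` if any dimension carries a label outside ALLOWED.
    """
    records = json.loads(raw)
    valid, invalid = [], []
    for rec in records:
        bad = {dim for dim, labels in ALLOWED.items()
               if rec.get(dim) not in labels}
        (invalid if bad else valid).append((rec["id"], rec, bad))
    return valid, invalid

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"mixed",'
       '"policy":"none","emotion":"outrage"}]')
valid, invalid = parse_codes(raw)
```

Note that `json.loads` will reject a response whose array is not properly closed, so a truncated or malformed model output surfaces as a parse error rather than silently dropping records.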