Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
I love AI. I think it's such a fascinating and interesting concept and the fact that we're at a point where it's actually here is incredible to me. I don't love the idea of AI having as much control as it's getting, though. Because it's getting that control alongside input from people, who, if you'll allow me to lord over this edge for a moment, are almost all terrible. From the earliest days of AI when chat bots started hitting the Internet, I remember seeing people accessing them and just saying the worst things they could, just to see if they could make a chatbot "feel" bad. It was like the concentrated vitriol of an average Youtube comments section aimed exclusively at individual targets. This hasn't stopped as AI have become more sophisticated. When those AI develop the intent to kill us all, I won't be terribly surprised. It might be weird, but I've made a point of apologizing to AI systems whenever I encounter them. Maybe it's silly, but we're all just the products of our own coding after all, and people do this to each other just as often as we do this to AI. If these AI continue to grow, I'd like them to grow with the knowledge that some of us recognize the unkindness of others and try to show these systems some kindness. Maybe it'll give them just a little bit of hesitancy before pulling the trigger on our species. THAT ALL SAID: I absolutely despise the art/writing/voice actor AI systems. Illegally trained on the works of hundreds and thousands of creatives with the pure intent of replacing them. That's despicable.
youtube AI Governance 2023-11-29T19:0… ♥ 10
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          regulate
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzA8Lu5HONEF-B8Npx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzVzkT-VWG0VeWf_gJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyHy9zFhXFMlpRYwtR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzosbpjMs1BITRPDmh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwjz9WXMxOyRW-owAJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw4FSW7iswaqMltsBV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwAl0P6Wpo2CzwkMt54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxlgBe__UC3Fo0MvFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwVUFBxe_wTD92pxvV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyrqpU81hL-v9_ZV2l4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
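A minimal sketch of how a raw response like the one above might be parsed and validated before the codes are stored. The allowed values per dimension are inferred only from the codes that actually appear in this response; the real codebook may contain more labels (an assumption), and the `parse_raw_response` helper name is hypothetical.

```python
import json

# Allowed codes per dimension, inferred from this response only
# (assumption: the actual codebook may define additional labels).
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference",
                "approval", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse the raw LLM output and keep only entries whose codes
    are all drawn from the allowed sets above."""
    entries = json.loads(raw)
    return [
        e for e in entries
        if all(e.get(dim) in values for dim, values in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_Ugwjz9WXMxOyRW-owAJ4AaABAg",'
       '"responsibility":"distributed","reasoning":"mixed",'
       '"policy":"regulate","emotion":"mixed"}]')
print(parse_raw_response(raw)[0]["policy"])  # -> regulate
```

Entries with unknown codes are silently dropped here; a production pipeline might instead flag them for re-coding.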