Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What a disappointing question from Chris Anderson (unless it was asked not naively, but as a rhetorical device to give Altman an opportunity to explore his ideas): “Sam, given that you're helping create technology that could reshape the destiny of our entire species, who granted you (or anyone) the moral authority to do that?” Admittedly, that question was suggested by ChatGPT, but Chris Anderson could have refined the idea, just as humans are expected to do with the results obtained through Generative AI. Is it, for practical purposes, a valid, operative question to ask, "Leonardo (or Galileo Galilei, Isaac Newton, Nikola Tesla, Albert Einstein)... where did you get (or who gave you) the moral authority to invent what you invented and change the course of human history?" In a free world, creative power, as long as it is exercised within the limits set by ethical markers (explicit and implicit), is not a right granted by exception. That right is present in each and every free citizen. It's just that, thanks to the cultural mindset most of us have acquired (for example: believing in the existence of a glass ceiling), we don't exercise it. What Sam Altman did is not a crime, just as what Steve Jobs did, or any of the geniuses who came before him, was not. Unless the haters now have our full attention and credibility.
youtube 2025-05-16T18:4…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear

Coded at 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxBX89o_UVsltdAsl54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzz8ajSZO2Y1FmHTg94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwLhN99S14cgh93ynp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwJWmqNxxGypimlnI14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxSKavnMPaX0CI6jex4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugww2IDrJ6TJpIHs80p4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwfVkNFVB5hDHnh4SB4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz47fXZDVXA7MQpKt54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwCJ5fxcJQnvJxni-l4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy0faz8wfldBbZKdkR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
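The pattern above, where the coding result shows "unclear" on every dimension even though the batch response contains coded entries, is what happens when a comment's id is missing from the model's output, or when the raw output cannot be parsed at all (for example, a stray ")" where the closing "]" should be). The sketch below shows one way such a lookup-with-fallback could work. It is a hypothetical illustration, not the tool's actual code: the function names `parse_batch` and `codes_for`, the sample id `ytc_abc`, and the repair heuristic are all assumptions.

```python
import json

# Hypothetical raw batch output; a model occasionally terminates the JSON
# array with ")" instead of "]", which plain json.loads would reject.
RAW = '[{"id":"ytc_abc","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"})'

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> list:
    """Parse raw model output, repairing a trailing ')' that should be ']'."""
    raw = raw.strip()
    if raw.startswith("[") and raw.endswith(")"):
        raw = raw[:-1] + "]"
    return json.loads(raw)

def codes_for(batch: list, comment_id: str) -> dict:
    """Return one comment's coded dimensions, defaulting to 'unclear'."""
    for item in batch:
        if item.get("id") == comment_id:
            return {d: item.get(d, "unclear") for d in DIMENSIONS}
    # Comment id absent from the batch: every dimension falls back to "unclear",
    # producing exactly the all-"unclear" table shown above.
    return {d: "unclear" for d in DIMENSIONS}

batch = parse_batch(RAW)
found = codes_for(batch, "ytc_abc")      # coded normally
missing = codes_for(batch, "ytc_other")  # all dimensions "unclear"
```

Under this reading, the all-"unclear" row is a fallback artifact rather than the model's judgment, which is worth distinguishing when auditing coded comments.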