Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Scientists come in many flavors. But the primary types are creators and historians - that's very simplified. There are those who use it to discover/create more. There are those who know science and use it or educate it. Dr. Tyson is one of the latter. One of the best at it, but he's not a creator or one who discovers new science. He doesn't have the imagination. I've noticed in many interviews that he always looks backwards. He doesn't have the vision or imagination to picture a new future with a new paradigm. And the more extreme the change, the more he can't see it. Dr. Tyson is one of the worst people to ask about the future potential problems of AGI. That's because this is a situation like none the world has ever encountered. The only even remotely similar example would be slavery - only without racism and the slaves are all smarter, better educated, harder working, and cheap and widely available. But that never happened. All past examples are irrelevant. And that's discounting ASI. And that's assuming that AGI becomes a reality (which it may never be). Just garden variety general intelligence that is able to learn and adapt and do things without prompting. I'm not sure what is likely to happen, but it's far, Far, FAR more paradigm changing than anything that's ever happened in history. Computer isolated AGI first, then AGI in autonomous robots next. By the time the second rolls around, it will be the biggest BY FAR change in human history.
youtube AI Moral Status 2025-07-23T21:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyVxPcLmRDCJyfUOa94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyDgG4kGh1wgw9ZFQ94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz6NNVKAbVp7vvder14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxPRZfaEBBnBTzcE8V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw3q1GbG810gcyrIPV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzMn2YemlSHbBCo3xB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxJAyllVBJE0k9RWMR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyvxsYyxkUXwBmVUkh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx3PS27ZwJ6eDWEEhh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgygNIzEudKEJIqiT5F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
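A raw response like the one above can be checked before it is accepted into the coding table. The sketch below is a minimal, hypothetical validator: the field names (id, responsibility, reasoning, policy, emotion) come from the JSON shown, but the allowed value sets are inferred only from the codes observed in this export and the actual codebook may permit more.

```python
import json

# Allowed codes per dimension -- an assumption, inferred from the
# values that appear in this export; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Example with one row copied from the raw response above.
raw = ('[{"id":"ytc_UgyVxPcLmRDCJyfUOa94AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
rows = parse_codings(raw)
print(rows[0]["emotion"])  # indifference
```

Rejecting a whole batch on one bad code is a deliberate choice here: it surfaces schema drift in the LLM output early instead of silently storing unrecognized labels.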