Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think Neil avoided answering why the Jetsons' robot vision was different from what happened. I think the difference is kind of interesting, and I happened to start reading science fiction at just the right time to already know the answer. Science fiction authors in the late 1960s and early 1970s actually did think the car could be the robot; there are a few examples of those. If I recall correctly, in one Jetsons episode there is a self-driving car. As in, one. Innumerable flying cars that people have to drive themselves, but George's upper management is wealthy enough that he can actually afford a car with an integrated robot. Computers were known to the general public in the 1960s and 1970s, but they were so expensive that "nobody" actually encountered them, outside of people whose professions or fields of study required them. It wasn't long since they had only been powerful enough to do notable work in a room-size configuration. People understood they were going to get smaller and cheap enough that normal people could own one in a hundred years (the Jetsons referenced the year something like once during the series; it was 100 years from the production date, long enough that the artists would be safely dead if flying cars didn't happen for some reason). They also understood that computers were machines that thought like people. That's right: according to my kindergarten classmates, AGI just happened with computers. It was part of their nature. They understood computers didn't have emotions, but they didn't entirely understand what that meant. As such, to my classmates in kindergarten, a problem a car that was a robot would have would be boredom. To be clear, I'm talking about what my kindergarten class thought because I'm just young enough that they knew somewhat better by the first grade. By the fourth grade, some of my classmates had a computer at home.
That said, many still thought that boredom was an issue computers had to deal with if they weren't being used and were left on. Given that general understanding, having a single robot that did tasks, while all of the other devices were more or less as we knew them, seemed to make sense. It was understood that we didn't know how to make the robot work yet, but that was a mechanical issue and a power issue, not a computing-power issue or a programming issue, because the latter two concepts weren't ones most people had. They knew that a computer broke a supposedly unbreakable cipher during World War II, and did not properly understand how difficult it is to drive, or for that matter to make toast if the bread is still in the breadbox. What I found most interesting about the Jetsons was that they apparently thought people would be *able* to drive a flying car in the traffic they depicted. George flies his car without any significant concern and with a level of precision that really would require computer control, even if it were just being depicted in two dimensions in a land car. He never swerves in the slightest, along any axis, unless the plot calls for it. He accelerates very quickly and comes to very abrupt stops very close to the vehicle in front of him. I live in a major US city and I've seen a lot of traffic. Not only am I not that skilled a driver, even discounting the third axis, I have never seen anyone who is that skilled. And yet I have heard a lot of people asking about their flying cars. Like they could possibly drive them.
youtube AI Moral Status 2025-07-25T08:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxeSAKLAVJUbIey6dp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxJbud1iZ67obZk5kx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyynjPaKux8iHMiVhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy1Wc1gtiB76xV3hg14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwLDcfUPR0DMz2tZB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFDRD--keH_dt2sil4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGF_PWuiti9omIVbJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw6ruBlUmlCDZMWv054AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxToSmdUI55Ar7oCyN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyaGO4Zzra185GOb0p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
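The coded dimensions shown in the table can be recovered from the raw model output by parsing the JSON array and indexing by comment id. A minimal sketch, assuming (as the matching values suggest) that this comment corresponds to the entry with id `ytc_Ugy1Wc1gtiB76xV3hg14AaABAg`; the raw array is truncated to two entries here for brevity:

```python
import json

# Raw LLM response: a JSON array of per-comment codes (truncated to two entries).
raw = '''[
  {"id": "ytc_Ugy1Wc1gtiB76xV3hg14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyGF_PWuiti9omIVbJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

def codes_for(raw_json: str, comment_id: str) -> dict:
    """Parse the model output and return the coded dimensions for one comment."""
    entries = json.loads(raw_json)
    by_id = {e["id"]: e for e in entries}  # index the batch by comment id
    return by_id[comment_id]

codes = codes_for(raw, "ytc_Ugy1Wc1gtiB76xV3hg14AaABAg")
print(codes["reasoning"])  # -> unclear
print(codes["emotion"])    # -> indifference
```

This mirrors how a batch response maps back onto a single comment's row in the coding table; `codes_for` is a hypothetical helper, not part of the pipeline shown.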