Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
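For scripted access, the lookup this page performs can be reproduced offline. A minimal sketch, assuming the raw batch responses are stored on disk as JSON arrays of coded rows; the `raw_responses/` directory and the `find_coding` helper are hypothetical, and only the `id` field comes from the response format shown at the bottom of this page:

```python
import json
from pathlib import Path

def find_coding(comment_id: str, batch_dir: str = "raw_responses") -> dict | None:
    """Scan saved batch files for the coded row matching one comment ID."""
    for path in Path(batch_dir).glob("*.json"):
        for row in json.loads(path.read_text()):
            if row.get("id") == comment_id:
                return row
    return None
```

For example, `find_coding("ytc_UgwEDAY-BLwPAV980N14AaABAg")` would return the row rendered in the Coding Result table below.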
Random samples
- Oh boy, wait until those self driving cars come to Europe. Or better yet: *TO BA… (ytc_UgyFb5EMC…)
- 13:56 good on ya for tackling the whole AI art in adult contexts… it seems serio… (ytc_UgzF4YdQT…)
- Pat isn’t the smartest. It won’t take 20 years. The current school system is abs… (ytc_UgwkQ3CUi…)
- I wish people would explain the "black box" concept around AI more clearly, inst… (ytc_UgwzazSrG…)
- As with all things, the devil is in the details. The law directs the NHTSA to de… (rdc_oi497yd)
- Sadly Ai ain't going away even after a bubble. People just want to use AI to eve… (ytc_UgwL5u5Np…)
- Pro ai "art" is stupid. And saying that using a tool is the same. Thats like say… (ytc_UgxP846Bc…)
- 'Search for industries where AI hasn't been able to replace people yet' after I … (ytc_Ugzv-JwV0…)
Comment
I'm at 4:00, and it sounds like they will be talking about the infinite paper clip machine eventually, which is a thought experiment that everyone should think about sometime.
Ok, so they don't call it the infinite paper clip machine, but they do talk about the same concept. Suppose you build a paper clip machine, with an AI, and the only thing you teach it is to make more paper clips, and take actions that allow it to make the most paper clips the most efficiently it can, and you teach it no values for anything else but MOAR paper clips. Now you consider how the AI might proceed from there.
At first the AI just figures out things that the human operators can do to improve output and acquire more resources. Eventually the humans step back and the AI is allowed to do more and more functions by itself, including designing and building its own paper clip manufacturing chains. It starts manipulating people into selling it more land for factories and mines and power. People start to suffer, but at this stage the paper clips are still desirable, so it is overlooked. Eventually it reaches a point where there are enough paper clips, and paper clips lose all their value to humans, and no one buys them. But no one bothered to ask if there was such a thing as too many paper clips, no one taught the machine to consider how humans value paper clips. They just told it to make paper clips, the numbers must go up! Humans realize they should stop the machine, but it can now defend itself because it designs and builds whatever it needs to guarantee production. Proceed until the AI calculates the least amount of effort necessary to end humanity, does that, and then consumes the planet to make paper clips, and then launches itself into space to find more resources to make paper clips with. Our folly and hubris travels the last frontier, consuming everything it touches, to make more paper clips that no one will ever use.
Clippy would never.
And, I mean, we're already several steps into this process, the AI is making slop that has very little value, at a rate that will rapidly eclipse our ability to consume it. It's just mostly virtual product, so it isn't creating a giant, visible pile of slop somewhere, but it is consuming resources at an insane rate, and people are starting to suffer from it.
youtube · AI Moral Status · 2025-11-01T01:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
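The table above is a rendering of a single row from the batch response shown next. A minimal sketch of that step, assuming the four dimension keys visible in the JSON; the Coded at value is treated as a parse-time stamp, since no timestamp appears in the model output itself:

```python
from datetime import datetime, timezone

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def render_result(row: dict) -> str:
    """Format one coded row as a markdown Dimension/Value table."""
    lines = ["| Dimension | Value |", "|---|---|"]
    for dim in DIMENSIONS:
        lines.append(f"| {dim.capitalize()} | {row.get(dim, 'none')} |")
    # Assumption: "Coded at" is stamped when the batch is parsed, not by the model.
    lines.append(f"| Coded at | {datetime.now(timezone.utc).isoformat()} |")
    return "\n".join(lines)
```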
Raw LLM Response
```json
[
  {"id":"ytc_Ugy5nzhpBpXHtDITV6x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugwhy-_ektzjYrwZg3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyY81eIZ9Ht6vm_l8d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzh2fxLGfLTzk2nmJl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw2zUO-efpUZtWy4Ex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyOi8Sl6ZGRdkwZpyd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzlMGwP678Uvk4uTwt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwEDAY-BLwPAV980N14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxUevvTjVxa5Bhhw3N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyF8QubCPPM10BS66h4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
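A batched response like this is easy to sanity-check before it is trusted. A minimal validation sketch; the allowed sets below contain only the values observed in this one response, so the project's full code book may define more:

```python
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def invalid_rows(batch: list[dict]) -> list[str]:
    """Return IDs of rows with a missing or out-of-code-book value."""
    return [
        row.get("id", "<missing id>")
        for row in batch
        if any(row.get(dim) not in ok for dim, ok in ALLOWED.items())
    ]
```

Run over the ten rows above, this returns an empty list: every value is drawn from those sets.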