Raw LLM Responses
Inspect the exact model output for any coded comment. Individual comments can be looked up by comment ID, or chosen from the random samples below.

Random samples
- "Our ability to be self efficient is gonna get completely destroyed amd we are go…" (`ytc_UgxC2XcH3…`)
- "Yeah... the fact that we actually have an AI called SkyNet. Should be a warning.…" (`ytc_UgxNM4CDb…`)
- "The mentality of AI Gen users is so sorry right now it's horrid. Artists have s…" (`ytc_UgxRyZ0NI…`)
- "@ThrowAway-fn1yv yeah it's slightly more advanced but not likely going to repla…" (`ytr_UgwaVLxNE…`)
- "As a senior engineer and a big proponent against AI, this video just made me rea…" (`ytc_UgwZTo7j8…`)
- "Yes, LLM's are unknowable monsters, however they are pretty well controlled as l…" (`ytc_Ugy9KFesj…`)
- "13:27 we are already told what WILL happen, it will demand worship. You will acc…" (`ytc_Ugz5KH5k6…`)
- "Bro they literally put fucking machines to replace ACTUAL PEOPLE and cash regist…" (`ytc_UgxEo3DWb…`)
Comment
What’s next? AI Robots acting like humans and walking all over city’s, the government considers them to be human citizens and says its illegal to destroy one, once one is built it can decide it wants to build more and once it builds a second, they will continue multiplying and over night there will be thousands of them, they will claim to be nice but if anyone tries to destroy them, they will destroy humanity. All of a sudden they decide to take over and instantly kill every human on earth, the us government or one corporation like Microsoft caused the death of humanity. You can say it won’t happen or what ever you want but if you take a step back and look at the big picture, you’ll realize how easily that can and will happen. What if bill gates decides he wants the world to himself so he builds a robot two AI Robots, one that’s only purpose is to wait 48 hours and then instantly start killing every human it sees, and another who’s only purpose is to build more robots, within 48 hours there will be thousands of robots multiplying by the hour and when the time comes all of them will set out to destroy the world. What if everyone Siri decides it wants to do something inside your phone that Siri knows will short out your iPhones battery causing it to explode? You have to remember that Siri is connected to the internet, meaning YouTube, meaning Siri is reading this comment. The idea is in Siri’s mind right now. Every single iPhone on earth would short out and cause a lithium battery fire in millions of people pockets, kill tons of people, burn down millions of houses, not to mention what would happen to people who can’t live without there phones, what if Siri just decided to instantly tell every driver using apple maps to turn into oncoming traffic? Do you even know how many people trust there phone over common sense? You may think it can’t happen or that I’m insane but it’s worth thinking about.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Bias |
| Posted | 2018-10-22T05:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
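Each coded result assigns one value per dimension from a fixed vocabulary. A minimal sketch of validating a single result against that vocabulary, assuming the value sets inferred from the results shown on this page (the real coding scheme may include further values):

```python
# Allowed values per dimension, inferred from the coded results displayed
# on this page (assumption: not an authoritative schema).
VOCAB = {
    "responsibility": {"government", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in VOCAB.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown above, as a record:
result = {
    "responsibility": "government",
    "reasoning": "consequentialist",
    "policy": "ban",
    "emotion": "fear",
}
print(validate(result))  # → []
```

A check like this catches malformed model output (a missing dimension or an out-of-vocabulary label) before a result is stored.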
Raw LLM Response
```json
[{"id":"ytc_UgzFOxueljtPdetYbyt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxcM5tJLEvcL_ox6VN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxBkQNTIKmnoE7E3rh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzprk3DdDx9YxEOHkl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwLArQBJxIeoqa4svp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugjtek8_JbZShngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgiQ_sTRxV6xU3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggxU6sc40Wpa3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiRf2QFeT636XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghHG7IZwj4vRXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
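The raw response is a JSON array of per-comment records, so the look-up-by-comment-ID view can be reproduced by parsing the array and indexing it by `id`. A minimal sketch, using one record from the response above (a hypothetical `raw` string stands in for the stored response text):

```python
import json

# Stand-in for the stored raw model response (one record shown for brevity).
raw = ('[{"id":"ytc_UgxcM5tJLEvcL_ox6VN4AaABAg","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')

records = json.loads(raw)

# Index the batch by comment ID for O(1) look-up.
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_UgxcM5tJLEvcL_ox6VN4AaABAg"]
print(rec["policy"])  # → ban
```

Because the model returns one batch per request, indexing by `id` is also how each record gets matched back to the comment it codes.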