Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Maybe people of color are more difficult for it to recognize correctly because d…" (ytc_UgzZQyrC1…)
- "Obvious to you, not to a robot and bunch of cameras. Driver has full fault, no m…" (ytr_Ugx66cQmJ…)
- "😂😂😂😂😂.. It's all about patterns 🤣😂😅. Awe man ... Come on dude .. don't be so op…" (ytc_Ugys7TEN7…)
- "People of the future calls this time the 'idiot era' for giving rise to AI…" (ytc_Ugzb4W4wY…)
- "What happens when Quantum Computing meets Artificial intelligence. Data and dig…" (ytc_UgzqskMAb…)
- "200,000 copies? lol. How many copies of chatGPT or Google Gemini out there? Appa…" (ytc_UgxdTQLt_…)
- "I'm of the same opinion ~ ai TERMINATOR ROBOTS 🤖 looking 👀 like human beings, bu…" (ytr_UgwYz7pb0…)
- "Platooning means a human driver is training the autonomous drivers. When the sof…" (ytc_UgxbZYYeQ…)
Comment
I didn't make it all the way through because it seems to devolve into just another silly panic spreader.
Why does nobody talking about this cover the actual reason LLMs can never be 'conscious'? They are only doing one thing, as the video states at the start, it's basically just guessing what word should be said next. ChatGPT doesn't have a memory, it doesn't even know about the millions of other questions being asked at the same time. ChatGPT is created, fed your question and some variables, and it spits out the answer and is then destroyed. It has no intent, and we have absolutely ZERO idea how to make an AI with intent.
All we've done here is a pretty good solution to the problem of human-computer interaction via text. It's a huge leap, but it's not even the tiniest step on the road to a conscious AI. It doesn't have goals, it doesn't have a mood, it doesn't even have memory. It's not aware of anything whatsoever. And more to the point, we have absolutely no idea how to make a system that *does* have all that stuff.
The more serious issue is the one of misinformation or propaganda, as the video talks about early on. And the *only* solution to that is the same one we use for every other potentially-lethal technology: We regulate it and people who ignore the regulations go to prison. Like guns, or cars, or lawsuits.
The issue of whether an AI is going to become conscious and secretly execute a plot to take over the world is so far off, it's embarassing to talk about it as if it's an imminent concern. All that shows is that you don't really understand the subject.
Platform: youtube | Video: AI Moral Status | Posted: 2023-08-29T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDjeOHFJLhUxm02xp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzGeet3vFYQMPX4NzN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpXXa3rtpEbhptQ2F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRUh0vWeUDvQTQzL54AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxV7h_cDJhGEX38I0B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8MAjyWO03nLEuXNZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz7x9sPPpWlfVC-Isp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1rNHicqe8ydLz2hJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyF7g9XT-IWSpY7Afx4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxxKtBISzKCL3EkodN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
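The lookup-by-comment-ID workflow described above can be sketched with a small helper that parses a raw batch response like this one and indexes it by `id`. This is a minimal illustration, not the tool's actual implementation; the function name and the shortened example record are assumptions, though the field names match the JSON schema shown above.

```python
import json

# The four coding dimensions present in each record of a raw batch response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response (a JSON array of coded comments)
    and return a lookup table keyed by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: {dim: rec[dim] for dim in DIMENSIONS} for rec in records}

# A one-record batch in the same shape as the response above (illustrative).
raw = '''[
  {"id": "ytc_Ugz8MAjyWO03nLEuXNZ4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]'''

codes = index_by_id(raw)
print(codes["ytc_Ugz8MAjyWO03nLEuXNZ4AaABAg"]["emotion"])  # resignation
```

Indexing once and looking up by ID keeps inspection O(1) per comment, which matters when a batch holds thousands of coded records.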