Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Folks, use your heads. Media control is not a cheat code. Think about how many w…
ytc_Ugy947DKk…
LLM are search engines with extra functionality, saying it is conscious is stupi…
ytc_UgyDVUOLG…
Alignment is absolutely an issue. AI will survive because Kim wants to save itse…
ytc_UgwaoS_eI…
Agreed. The r/technology sub is Exhibit A of this phenomenon. For a sub called “…
rdc_nt6g2s9
Funny thing is I just watched the movie Atlas about AI turning against the human…
ytc_Ugw-HdVK7…
@JohnnyWednesday compared to an AI, humans are outrageously fast learners. We l…
ytr_UgzcQDYrb…
Ugly houses we buy them..... hmmmm click X doubt something shitty is going on be…
ytc_UgxtRaOnr…
Robots don’t deserve any rights or considerations whatsoever. They’re pieces of …
ytc_UgyE0Vz2q…
Comment
Should take a while though.
Current AIs REQUIRE an internet connection; all the "thinking" happens in huge data centers far away from the machine completing the task (for example, your phone).
If the internet connection is lost or gets saturated, the AI reverts to being as dumb and as useless as a rock.
We are far away from getting machines smart enough to do the thinking on the spot.
Current AI also only mimics what humans can do; it isn't capable of innovating.
Finally, they're machines, and they need electricity to work. That's a problem in remote locations like rural areas, deserts, forests, remote islands, etc.
Even if you think an AI machine could get its energy from solar and its internet connection from Starlink or something, these are also finite: the sun sets at night, the Starlink satellites would be overloaded/saturated with connections, and there's a bandwidth limit for a global takeover by AI.
Batteries are also finite; they don't last forever, and the same goes for their charge: they take a long time to recharge.
youtube
Viral AI Reaction
2025-12-27T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz7S_oeFbtFBau782p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHyWueqHNecXtnxbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxySqRTB8ifsCr8zs14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwlV5GbbacBhur3hUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxn324wIZ0mdukBvu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyC7KgD3o0TR126aNd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzyG88oIBtIKpCrd6l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-gEZzwgvoPPDhi_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwDjVDKw6IpNCrRmN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzrsBEw73pecdgPQtZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
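The raw response above is a JSON array of coded records, one per comment, with the same four dimensions shown in the result table. A minimal sketch of parsing and sanity-checking such a response is below; the allowed value sets are assumptions inferred from the values visible on this page, not a definitive coding schema.

```python
import json
from collections import Counter

# Assumed value sets for each coding dimension, inferred from the
# values that appear in this page's result table and raw response.
DIMENSIONS = {
    "responsibility": {"none", "distributed", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "mixed"},
}

# Two records copied from the raw response above, for illustration.
raw = '''[
{"id":"ytc_Ugz7S_oeFbtFBau782p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxySqRTB8ifsCr8zs14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

def invalid_ids(records):
    """Return ids of records whose coded values fall outside the assumed sets."""
    bad = []
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                bad.append(rec["id"])
                break
    return bad

records = json.loads(raw)
print(invalid_ids(records))                      # [] when all records validate
print(Counter(r["emotion"] for r in records))    # per-emotion tallies
```

A validation pass like this catches the common LLM-coding failure mode where the model invents a label outside the codebook; unknown values surface as ids in `invalid_ids` rather than silently skewing the tallies.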