## Raw LLM Responses

Inspect the exact model output behind any coded comment: look a comment up directly by its ID, or pick one of the random samples below.
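To reproduce the lookup offline, here is a minimal sketch. It assumes the raw responses are saved as a single JSON array of objects carrying an `id` field, i.e. the format of the Raw LLM Response at the end of this section; the file name `raw_responses.json` and the function name are hypothetical.

```python
import json

def find_coding(comment_id: str, path: str = "raw_responses.json") -> dict | None:
    """Return the coding entry for one comment ID, or None if absent.

    Assumes `path` holds a JSON array of objects, each with an "id" key,
    in the format of the Raw LLM Response shown at the end of this section.
    """
    with open(path, encoding="utf-8") as f:
        entries = json.load(f)
    return next((e for e in entries if e.get("id") == comment_id), None)

# Example, using the one full ID that appears in the batch below:
# find_coding("ytr_UgylGmOEwcVrQcjo6H54AaABAg.AHOW14QCSVWAHP_cRWJ4Cc")
```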
### Random samples

- There is one undeniable fact, AI doesn't care. When AI determines humans are the… (ytc_Ugw-Xd9KK…)
- I write code with AI daily. It’s not unmaintainable at all. Maybe 6mo ago it was… (ytr_UgxVExn2v…)
- Yeah, if every normal artist stopped suddenly then the ai would be trained on ot… (ytr_UgzNXhJuI…)
- Failed logic... "If it's dangerous, we aren't going to build it. Right?" Tell t… (ytc_Ugy-rkomG…)
- AI can't get inspired since it doesn't have any actual intelligence or conscienc… (ytc_UgzuPSBLm…)
- I dont care how automated they get, I want a human in each one to supervise and … (ytc_UgwY43111…)
- The guy who said his AI video was cutting edge was correct. One of the best shit… (ytc_UgylA9y52…)
- If you all knew what I knew you would pull the plug on the Internet right now. W… (ytc_UgxaS56WN…)
### Comment
@41-Haiku To be clear, I actually do take the existential risk of AI seriously. When it comes to humanity's chance of self-extinction, I'm somewhat of a pessimist, so the only real debate as I see it is whether it'll be climate change, mutually assured nuclear destruction, AGI, a pandemic, or something entirely out of left field (joking, but only somewhat). This video however, does not take that threat seriously. It equates LLMs with AGI, throws around dates like they're certainties rather than utterly wild, random-shot-in-the-dark guesses, and makes very specific, EXTREMELY unhinged predictions about how EXACTLY AI will end us. Just to give an example straight from the video (and the point where I feel like it went over the top and I had to stop watching), 2028 will be the year that AI (I'm guessing an American made one based on the inferences) rewires ITSELF to become super-intelligent, then quietly allies itself with its Chinese counterpart, at which point they cooperatively start amping up nuclear saber-rattling, start mass-producing (unbeknownst to any of us stupid humans) and then deploying insect-sized drones and then bird-sized drones to counter the insect-sized drones (Wait... I thought they were cooperating... Why are they countering their mutually agreed to plan!?), and thusly, humanity falls to its new AI overlord(s)(not sure if it's one singular AI, several in cooperation, or several in opposition, the whole thing was confusing...). This is the difference between ACTUAL science, SPECULATIVE science, and completely UNHINGED (there's that word again), poorly written science fiction. Nostradamus knew enough to be extremely vague and not set dates, so that given enough time, events would happen that could be shoehorned into his predictions and the more naive among us would label him a genius. Pindex apparently missed that memo, but then again, their handle IS @pindexsf so maybe the SF stands for sci-fi after all... If that's the case, keep doing your thing with your little kung-fu robots and flappy bird drones.
Sorry if this came off ranty, btw. It's not meant as an attack on you, but a criticism of the video/channel.
Source: youtube · Video: "AI Moral Status" · Posted: 2025-04-27T05:4… · ♥ 1
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
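When post-processing these codings, a small validity check catches entries where the model drifted off the codebook. The sets below contain only the code values observed in this section; the actual codebook may well be larger, so treat them as an illustrative assumption rather than the authoritative scheme.

```python
# Code values observed in this section only; the real codebook may be larger.
OBSERVED_CODES = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval", "outrage", "resignation"},
}

def invalid_dimensions(entry: dict) -> list[str]:
    """Return the dimensions of a coded entry whose value is not a known code."""
    return [
        dim for dim, allowed in OBSERVED_CODES.items()
        if entry.get(dim) not in allowed
    ]
```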
### Raw LLM Response
```json
[
{"id":"ytr_Ugz1H5JJzdwHQPJYo454AaABAg.AHOdwYbILlUAHOfvgFX6SY","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxK-l2ZP41loCLqNx94AaABAg.AHOcXBLemzKAHP-USNUptv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxK-l2ZP41loCLqNx94AaABAg.AHOcXBLemzKAHS2RFtuu6R","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzoRu3_W-UgofvRr5t4AaABAg.AHObNjMEWeOAHOgAuH_kyF","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgznqhngPdcAmHluP_p4AaABAg.AHOY9Jr9jECAHP00izhxzr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgznqhngPdcAmHluP_p4AaABAg.AHOY9Jr9jECAHP24vHOMT_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_UgznqhngPdcAmHluP_p4AaABAg.AHOY9Jr9jECAHPS3ysGTf3","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgylGmOEwcVrQcjo6H54AaABAg.AHOW14QCSVWAHPVIP6jii9","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgylGmOEwcVrQcjo6H54AaABAg.AHOW14QCSVWAHP_cRWJ4Cc","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_UgwM5b-WYKTTDKGSnUN4AaABAg.AHOVxjH4jcAAHRi7lYnI81","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
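The Coding Result above corresponds to the ninth entry in this batch (`ytr_UgylGmOEwcVrQcjo6H54AaABAg.AHOW14QCSVWAHP_cRWJ4Cc`: distributed / consequentialist / regulate / resignation). A minimal sketch of turning one raw response into per-comment rows, assuming the model returned a well-formed JSON array as above; a production pipeline would also need retry or repair logic for malformed output.

```python
import json

def parse_batch(raw: str) -> dict[str, dict]:
    """Map comment ID -> coding for one raw LLM response.

    Expects `raw` to be a JSON array of objects with an "id" key, like
    the response above; raises ValueError on anything else.
    """
    entries = json.loads(raw)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array")
    coded = {}
    for e in entries:
        if not isinstance(e, dict) or "id" not in e:
            raise ValueError(f"malformed entry: {e!r}")
        coded[e["id"]] = {k: v for k, v in e.items() if k != "id"}
    return coded
```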