Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Why do publicly traded Corporations have to have Audited financial statements by…" (`ytc_Ugw5xz9uu…`)
- "I think singularity begins when AI merges with quantum computers. The AI could d…" (`ytc_Ugwk23Qrw…`)
- "Aha... Fighting a robot that is designed to box with what I'd guess it has dozen…" (`ytc_UgyWLuQfc…`)
- "Dude the term AI airtist is so dumb to me i cannot explain how dumb it sounds…" (`ytc_UgwXuuY4e…`)
- "Great points, another thing to think about is ai could not produce what it does …" (`ytc_Ugx5OFJyD…`)
- "Here, I will address the possibility of an AI-generated extinction level event. …" (`ytc_Ugy6ST8Cs…`)
- "Chess already figured it out 30 years ago. Yes ai is plain better at it. But th…" (`ytc_Ugzs1xbZy…`)
- "That's the stupidest argument ever. All the people who made those arguments are …" (`ytc_Ugx7sJArc…`)
Comment
0:52 I truly do think this is the biggest, most important worry to have. Everything else is still a huge, worry, don't get me wrong, but an existential threat that WILL happen if we continue our present course is upsetting to say the least. I've been worried about ASI for over a decade and a half at this point, ever since I really become aware of the expert literature. I'm relieved more and more people are taking it seriously. Especially needed now because people naively look at modern LLMs and dismiss any future advancements, assuming ASI we're talking about would somehow use modern technology only, which is silly and not the point at all of the worry. The worry is these advancements leading to other, more impactful advancements that leads to true AGI (and swiftly ASI afterward).
Source: youtube | Video: AI Moral Status | Posted: 2025-10-30T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxxuL0rIDRv6S4onAp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxV2YgRxgdc1F1hK-R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxxMcFp938sqEB2x6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx51tCuxt7S0BiUp614AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwYohzxjxoYmuBkcrV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugy1pg6e_fFmqKOJTHF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxFZpLLvJEtoqFWd654AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwRwTJYJFvGhe5WBGd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhB5pcpXVKzCtGOUx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-BFV-_V6K0ci-9zt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}]