Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "one question, even if i use a different texture for my art, will it still poison…" (ytc_UgwdYY2Xg…)
- "I like AI image/ video generation and traditional art. Working with AI can make …" (ytc_UgzYZUbHN…)
- "AI is totally a waste of time for anyone who needs a person who knows the answer…" (ytc_UgzAwVb-R…)
- "AI-artificial idiot / Not intelligent, its just code written by humans. You can ca…" (ytc_Ugz4BqGyf…)
- "AI is a serious existential risk and it seems obvious to me what we MUST do: inv…" (ytc_UgyTGAyJN…)
- "Awesome video Ann! AI is taking over content creation too, but that doesn't mean…" (ytc_Ugx0lNDy_…)
- "17:13 the way that these ads are done makes me think that they were scripted by …" (ytc_UgxbjDf-w…)
- "If I'm tooling down the interstate for hours on end... Why not play Netflix or s…" (ytr_UgwcZTazu…)
Comment
Your summation in the epilogue threw me through a loop, all the sudden questioning whether ASI is even possible... is that cognitive dissonance as a result of fear? As Nate and you discuss in this video, the researchers and experts right now debating this subject all unanimously agree, we will eventually create the program that creates the program that develops ASI. The only question is time. That's the AI debate: how long do we have before we create the program that eventually wipes out humanity. Not a single person knowledgable in this field is questioning whether or not it's _possible_ , or even questioning whether we are currently on the path to making it happen. Because everyone knows, it is and we are. Don't forget that and fool yourself into believing it won't happen.
Source: youtube · Video: AI Moral Status · Posted: 2025-10-31T01:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzFWjPxkVWOsujH9ll4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzT_V6rjZMblmZFKhx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwPqhU1Y94q7MlruVl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwhmHIKsyvU8aT63894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyagZ-OLXQ1iiUpu-d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"horror"},
{"id":"ytc_UgzfF7u5seJ-9W784G94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwc8cwVmqY2yStK5qp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzm40otCkmJW9KHb0l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyBGAG-3NHjz1r77Pp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugww88gxC1xcl4ZN7Cp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
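The lookup-by-comment-ID workflow above can be sketched minimally. This is a hypothetical helper, not the project's actual code: it parses a raw response shaped like the JSON array shown (each record carrying `id`, `responsibility`, `reasoning`, `policy`, and `emotion`) and indexes the records by comment ID. The function name and the error-handling choices are assumptions for illustration.

```python
import json

# Dimensions present in each coding record, per the sample response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of coding records)
    and index the records by comment ID for direct lookup."""
    records = json.loads(raw_response)
    index = {}
    for record in records:
        comment_id = record["id"]
        # A duplicate ID would silently overwrite an earlier coding,
        # so treat it as a malformed response instead.
        if comment_id in index:
            raise ValueError(f"duplicate comment id: {comment_id}")
        missing = [d for d in DIMENSIONS if d not in record]
        if missing:
            raise ValueError(f"{comment_id} missing dimensions: {missing}")
        index[comment_id] = record
    return index

# Two records shaped like the raw response above (IDs shortened
# here purely for illustration).
raw = '''[
  {"id":"ytc_AAA","responsibility":"none","reasoning":"mixed",
   "policy":"none","emotion":"mixed"},
  {"id":"ytc_BBB","responsibility":"ai_itself","reasoning":"consequentialist",
   "policy":"none","emotion":"fear"}
]'''

codings = index_codings(raw)
print(codings["ytc_BBB"]["emotion"])  # fear
```

Indexing once and looking up by ID keeps inspection O(1) per comment, which matters when a batch response codes many comments at a time.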