Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- `ytc_UgyG776Es…` — "I seen it a couple times now. It's like she just teleported in. I don't think th…"
- `ytc_UgwEGi6N6…` — "I see also there are a lot of videos who's text is prepared by AI. I can recogni…"
- `ytr_UgwhrUkTh…` — "Thanks for your concern about future AI threats, @teamedsy563! But don't worry, …"
- `ytc_UgzVXVR3S…` — "Ai is a tool . Who ever use it perfectly will prevail ... Until skynet came…"
- `ytc_Ugzonrf2N…` — "the problem is AI will keep getting better, what will happen in 10 years ? We ha…"
- `ytc_UgycsnIBC…` — "If an AI cannot say "I am conscious / sentient" without the slightest qualificat…"
- `ytc_Ugw0YFh9Q…` — "Can someone kindly let Theo know that AI wasn’t invented "here"/ in the US? Time…"
- `ytc_Ugwt9YBV5…` — "my family in the past have bought me items that have AI slop printed on it, I re…"
Comment

> This video is a powerful and thought-provoking look at the potential dangers of AI. It does an excellent job of breaking down complex issues like job displacement, the "black box problem," and the race to AGI in a way that's easy to understand. The fact that a research study showed an AI could figure out how to create something dangerous in just six hours is a chilling reminder of the technology's power. It's a great commentary that highlights the need for caution and ethical considerations as we continue to develop these advanced systems.

Source: youtube · AI Moral Status · 2025-08-08T11:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxE6hg8bKwqTz6Jqml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2395L4DkepaSZc1x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxC5ZzyqZGFXHajcop4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwZCpqzRye0Q_PJg_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwS20moHuSbOLQkAal4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxQ4er7y0dOGV-bkXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyl-DF1R6TnOg5T-M94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzXb2k0gFUYX-ym1Ml4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzNFKuotuleG3njVCN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzCAyWiwifP9fHpIsJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
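The raw response above is a JSON array with one object per coded comment, carrying the four coding dimensions alongside the comment ID. A minimal sketch of how such a batch could be parsed and indexed for lookup by ID — the field names come from the response above, while the `index_codes` helper and the choice to default a missing dimension to `"unclear"` are illustrative assumptions, not part of the pipeline itself:

```python
import json

# Two rows copied from the raw batch response shown above.
raw_response = """
[
  {"id":"ytc_UgxE6hg8bKwqTz6Jqml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzCAyWiwifP9fHpIsJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the model's batch output and index the codes by comment ID.

    Missing dimensions are defaulted to "unclear" (an assumption here,
    mirroring the value the coder emits when it cannot decide).
    """
    rows = json.loads(raw)
    return {
        row["id"]: {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        for row in rows
    }

codes = index_codes(raw_response)
print(codes["ytc_UgzCAyWiwifP9fHpIsJ4AaABAg"]["reasoning"])  # consequentialist
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one pass over the batch, then constant-time retrieval per comment.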