Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I can tell you things about AI that ppl wouldn't believe even if I told them the…
ytc_UgzntbTNd…
Art isn’t made for a competition, it’s a form of expression and enjoyment. You s…
ytc_UgxnreYIT…
>:c I am now hesitating to put up Godzilla Vs. Ridley on my YouTube channel in f…
ytc_UgxrD9Zl1…
I agree with Dave with his predictions. My career as a senior engineer who archi…
ytc_Ugye-XuBQ…
ChatGPT said that name of Lyyn Louds father is Jovanka, so there you have an adv…
ytc_UgzAVaJv-…
Lavender Town still going strong in 2024, I remember subbing to you back in 2018…
ytc_UgwmvHQDv…
LLMs are biased towards returning python functions rather than python classes, s…
ytc_UgywYW11q…
I used AI to generate visuals for a documentary series. Directed every frame, ch…
ytc_UgwPgNBYD…
Comment
The problem is that while humans built these systems, we don't control them. We don't even understand how they work. The best experts put the risk of the complete extinction of humanity due to AI at 10-30% by 2030 and 70-95% by 2050. We have to freeze this technology at the current level and prevent any future development. We have to limit development to narrow AI and ban any development toward general AI. We have to make sure that no one ever creates a super intelligence or it's lights out for the species, maybe all life on Earth and beyond.
youtube
AI Harm Incident
2025-10-13T22:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw1mUlIaPOf8OXce9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6DtYnybnsd1pi0gh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhwVPuUiOzM6IpeVR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx20rCVRPQfe64tj794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVaY3At6kKO8IQDI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwt10sOr9_Ln_P83op4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8eFQGRKLX45g2MI54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxbuE0i0pHvnv683-p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxAQ8qEL2yYEirL5Zd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLkwYCdeC2PyAQmXd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
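The raw response above is a JSON array of one coding per comment, each keyed by comment ID with one value per dimension. A minimal sketch of parsing and validating such a response — the closed vocabularies below are inferred from the values visible on this page and are an assumption, not the tool's actual schema:

```python
import json

# Allowed values per dimension, inferred from the coded rows shown above.
# The real coding scheme may include categories not seen here.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "ban", "regulate"},
    "emotion": {"indifference", "fear", "outrage"},
}


def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with a plausible
    comment ID and in-vocabulary values for every dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not str(row.get("id", "")).startswith("ytc_"):
            continue  # drop rows whose ID does not look like a YouTube comment ID
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Dropping out-of-vocabulary rows (rather than raising) matches how a batch coder would typically handle the occasional malformed LLM output: invalid rows can be re-queued for recoding instead of failing the whole batch.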