Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "When all these data centers and ai centers go bankrupt we will have a lot of che…" (ytc_UgylVro1k…)
- "I dont care i write stories to make my children happy because im tired of rehash…" (ytc_Ugza4B3Pk…)
- "Senior SRE/DevOps here… being forced into ai use over the last year has caused m…" (ytc_Ugz-bXD6P…)
- "More annoyed with endless AI advertising on YouTube. Fake people, fake testimoni…" (ytc_UgzKqX2_F…)
- "Task coordination , work flow, and patient forums can be done by AI automation. …" (ytc_UgxA_HVdK…)
- "When people are pushed conservation they turn eventually to criminality. The cyb…" (ytc_Ugx7OU5Yc…)
- "everyone i know who uses AI says they are stressed out and overworked. if AI is …" (ytc_UgwT5PDbg…)
- "AI is terrifying! And whoever did that to that mom is really messed up in the he…" (ytc_UgwitPo96…)
Comment
It baffles me that Dr. Tyson is so nonchalant about the risks of runaway AI. Nuclear physics also offered huge promise for energy production, but missing one detail on cooling rod design nearly wiped humanity off the planet with Chernobyl. And we had a far better grasp on nuclear physics then than we do on how AIs work now. If anybody would have that kind of example in their back pocket I would have expected it to be Tyson.
There's a book called "If Anyone Builds It, Everyone Dies" written by 2 of the longest-standing researchers in AI-alignment. I highly recommend reading it to understand this isn't just alarmism.
Source: youtube · AI Moral Status · 2025-10-09T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
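A coded row like the one above can be checked against the coding schema. This is a minimal sketch: the allowed values per dimension are inferred from the raw responses shown on this page, not from an official codebook, and `validate_row` is a hypothetical helper, not part of the tool.

```python
# Hypothetical validator for one coded row. ALLOWED is inferred from the
# values observed in the raw responses on this page (an assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_row(row):
    """Return (dimension, value) pairs that fall outside the schema."""
    return [(dim, row.get(dim)) for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]

# The coding result from the table above passes cleanly.
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "fear"}
print(validate_row(row))  # → []
```

A non-empty return value flags which dimension the model answered outside the schema, which is useful before aggregating coded rows.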
Raw LLM Response
```json
[
  {"id": "ytc_UgyDtupO9bmltIr2M7N4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwR8n1RS7C1QEWDnYx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyNiygHMsonXzIEeuZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxRxdFpGrBn0NrCX6J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwnlU8EL3XRvzkVP7N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxyF_zMoS82yaMyy694AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx7IWMhFVEWmx_oxDV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzYf5000temkNKkiWB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxz0Uj7CK4Vtqf3rih4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwHeNwep2Zfve0OQ1V4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
```
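Because the model codes comments in batches, looking up one comment means parsing the whole response and indexing it by ID. A minimal sketch, assuming the batch format shown above (a JSON list of objects with an `id` plus the four coded dimensions); `index_by_id` is a hypothetical helper, not part of the tool:

```python
import json

# Two entries in the raw-response format shown above (IDs taken from this page).
raw = '''[
  {"id": "ytc_UgyNiygHMsonXzIEeuZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwR8n1RS7C1QEWDnYx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

def index_by_id(response_text):
    """Map each comment ID to its coded dimensions (everything except "id")."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_id(raw)
print(codes["ytc_UgyNiygHMsonXzIEeuZ4AaABAg"]["policy"])  # → regulate
```

With the response indexed this way, the "look up by comment ID" view is a single dictionary access, and malformed JSON surfaces immediately as a `json.JSONDecodeError`.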