Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- If AI can do a better job than you do, or should. Films have been terrible for y… (ytc_UgzyYoRz_…)
- This is extremely interesting OMG. This really sounds like their AI has reached … (ytc_UgwHfAF4f…)
- 15:12 chatgpt just trying to tell him "none of this BS we're talking about matte… (ytc_Ugxb-Amoi…)
- It's because people like to think they will be just fine if they don't drive dru… (ytc_UgwnChjmS…)
- Not to mention that we would have no way to know if a robot is sentient.… (ytc_Ugyf6rYVf…)
- Here's is a thought, will it be better if we have no self driving car? We can no… (ytc_UgjY5ZbRH…)
- So Demis is afraid of "rogue nations" training AI with the "wrong values". How c… (ytc_UgyawWFwQ…)
- When I was in preschool I was the only one interested in art, and unable to fini… (ytc_UgxWQxsKs…)
Comment
49:32 I'm not afraid of alienness... I'm afraid that we make a super-powerful entity with the exact same flaws as humans. When we allow humans to amass incredible amounts of power, they almost always use it selfishly and harm others. And those people are the ones shaping (parenting) the AI models... So why wouldn't the AI capable of killing us all act similarly selfishly?
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-11-13T22:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
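Each coded comment carries the four dimensions above. A minimal sketch of checking one coded record against those dimensions follows; the allowed-value sets are inferred only from the labels visible on this page, not from an exhaustive schema, and the `validate` helper is hypothetical.

```python
# Allowed values per coding dimension, inferred from the labels shown on
# this page (not an exhaustive schema).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "resignation", "indifference", "approval", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

record = {"id": "ytc_Ugz10SmduLaUTyoC3e94AaABAg",
          "responsibility": "developer", "reasoning": "virtue",
          "policy": "none", "emotion": "fear"}
print(validate(record))  # → []
```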
Raw LLM Response
[
{"id":"ytc_Ugxq77RqxhqonCeaATB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz10SmduLaUTyoC3e94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgztFDp9NaMemoVVsMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwpxWBRwfB3Z6UYtOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlgKEXEjSXguQBcRx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxsDYGt5pnHuEcwADB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRl-p9vVOUqtFNm554AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmROmulnnePKne1L54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwEi_7ke6mt_U6kqIF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
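The raw response is a plain JSON array, so the "look up by comment ID" feature above can be sketched by parsing the array and indexing records by `id`. The two records in `raw` are copied from the response shown; the variable names are illustrative.

```python
import json

# Parse a raw LLM response like the one above and build an id -> coding
# lookup, mirroring the page's "Look up by comment ID" feature.
# `raw` stands in for the model output string; field names match the JSON shown.
raw = '''
[
 {"id":"ytc_Ugz10SmduLaUTyoC3e94AaABAg","responsibility":"developer",
  "reasoning":"virtue","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugzg9KKyHaOn5A1fsDV4AaABAg","responsibility":"developer",
  "reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
'''
codings = {rec["id"]: rec for rec in json.loads(raw)}
print(codings["ytc_Ugz10SmduLaUTyoC3e94AaABAg"]["emotion"])  # → fear
```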