Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
"think about the feeling of the AI" "it just wants consent" ... This guy is a nu…
ytc_UgxaF06VR…
Humans can get ahead of problems in ways AI can’t. Guess we’ll see how this all …
ytc_UgydV5pl6…
"Hi Jose, we are sorry to say that you got the wrong answer but in any case, the…
ytr_UgzwzmpiM…
The real plot twist is. This is actually sambucha AI who reacts in this video. A…
ytc_UgxjZyr3w…
What about envirommental limitations?
These softwares would use TONS of electri…
ytc_UgyyWDkem…
Imagine this being a call for any other kind of software...like...
* Only softw…
rdc_jkg3239
I, for one, look forward to our robot overlords. Heck, I’ll be running away and …
ytc_Ugw_AHnSs…
AIs backbone is energy & cpu power, no energy - no AI. So whats all the hype abo…
ytc_Ugy39gwK0…
Comment
Great and unsettling interview. I like this guy. What was unclear to me is this. AFAIK in order for something to "want" something / care about its existence (and thus care for itself, care for the truth, have feelings etc.) it has to be an embodied self-organizing system that is looking out for optimal conditions for survival. Since this is not the case with the kind of AI that Hinton is talking about, as he keeps emphasizing its digitalness, why does he think it has feelings or that it could aspire to maintain its own existence? By the way, I'm trying to present the argument of a former colleague of his, John Vervaeke, a cog sci professor at the University of Toronto. (Who, accordingly, is more worried about the undergoing experiments with embodied AI.)
youtube
AI Governance
2025-10-24T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyUP1VgWgbw_upTO1V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxCrdbY8Pg_0C21QKR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzqiniEaX7Y88pXrSZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzN4QQiIK9r_T_0UY94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdWx8DRlgTOBS2nDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwh0OwA9wh9XQT_JNN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwnckRscvRaddL9u9R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz3wBu54bulem4psJx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy5w_z5rThkKpjnMOJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzmmSbEEqfvb6cWnAR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
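The raw response above is a JSON array of per-comment codings, one object per comment ID, with the four dimensions from the coding table. A minimal sketch of how such a response could be parsed and validated is below; the allowed value sets are assumptions inferred only from the values visible in this dump, and the real codebook may define others.

```python
import json

# Allowed values per dimension, inferred from the codings shown above
# (an assumption -- the actual codebook may include additional labels).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept when it has a non-empty "id" and every dimension
    carries a recognized value; anything else is silently dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id"):
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with one valid row (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
print(len(parse_codings(raw)))  # 1
```

Dropping malformed rows rather than raising keeps a batch of codings usable even when the model emits an off-codebook label for one comment.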