Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "And def, if I feel AI replacing a chef's work or a waitress job, it has to be ch…" (ytr_UgyzbPxNH…)
- "AI emergence at the same time as unrestricted drone warfare is about to turn Sky…" (ytr_Ugy8Lc_6N…)
- "I completely agree. The gap is growing, especially between the mono-model chatb…" (rdc_nt8qybh)
- "I still wouldn't say it's stable diffusion's fault. They simply made a tool that…" (ytc_Ugxn81C93…)
- "Bernie, I usually agree with everything you say and this time I share your analy…" (ytc_UgxVlVjpf…)
- "At first I was worried and discouraged from pursuing art as a career with this A…" (ytc_Ugwfx_lnX…)
- "I hate it when these videos are nothing more than commercials. Forget AI, get m…" (ytc_UgwoSAnfk…)
- "I wonder if the solution to an AI Universe 25 will be space travel? Universe 25 …" (ytc_Ugy-e8_gA…)
Comment
First, I'm so sick of inflammatory, sensationalistic headlines and video titles. There's much more to this video/interview than the quote gratuitously taken out of context. Second, for a Nobel laureate, Hinton sometimes gives bad answers to questions with obvious "right" answers. For example, "knowing what you know now, do you regret the work you've done on machine learning? Would you do it again?" His answer, "I don't know." What? I don't know? WTF. Whaddaya mean you don't know? So you'd forego all of the completely life-changing good that is coming from and is going to come from advancement in AI just because there's risk to be managed? C'mon, man. what kind of answer is that? Advancement in AI is inevitable, inexorable. It is the ultimate destiny of humanity and this universe.
youtube · AI Jobs · 2025-11-05T09:3… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5xH8HMJuXuKnxndZ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx3_A4vq8vjZPMnmbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzYJDGT6O5DoD4HFHl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxNbn6rqAa_NmWT0e94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxA3hhvGvqBmH5CInV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugybg8StZT6mrRNbtm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgweGMCr3gLRJOpoFnx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwVv1Yi2gvcb-p_8wp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxSGv9mvfcM1juKyRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwysnHrmF5v-C4Ww_p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
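The "Look up by comment ID" feature above can be sketched as follows. This is a minimal, hypothetical example (the actual page's implementation is not shown here): it parses a raw LLM response of the shape printed above into a dictionary keyed by comment ID, skipping malformed records since model output is not guaranteed to be well-formed. The two records in `raw_response` are copied from the batch above.

```python
import json

# Hypothetical excerpt of a raw LLM response: a JSON array of coding
# records matching the schema shown above (two records copied verbatim).
raw_response = '''
[
  {"id": "ytc_Ugw5xH8HMJuXuKnxndZ4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxA3hhvGvqBmH5CInV4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
'''

# The four coding dimensions shown in the "Coding Result" table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index coding records by comment ID.

    Records missing an ID or any of the four dimensions are skipped.
    """
    by_id = {}
    for record in json.loads(raw):
        if not isinstance(record, dict) or "id" not in record:
            continue
        if not DIMENSIONS <= record.keys():
            continue
        by_id[record["id"]] = {k: record[k] for k in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgxA3hhvGvqBmH5CInV4AaABAg"]["emotion"])  # outrage
```

With an index like this, looking up the exact model output behind any coded comment is a single dictionary access rather than a scan over the raw response text.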