Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Honestly, there seems to be a strawman fallacy in here. I've never seen an AI ar…" — ytc_UgxFHq8IJ…
- "Unfortunately a lot of our voters are older, and easily manipulated by tradition…" — rdc_da40c7t
- "i dont rlly draw (disability reasons, tho i Texhnically could it hurts too bad) …" — ytc_Ugz4M__F3…
- "Even actors will be out of a job soon, just like the fake AI music. Authors bewa…" — ytc_Ugx29f6QE…
- "Well, that's totally plausible, I don't get why AI robots with endless intellige…" — ytr_UgyJkT_UL…
- "Haven't watched the video yet but if this is 30 minutes to get to the conclusion…" — ytc_UgzMBzUUj…
- "Yang Gang!!! You should interview him !Your background in AI will definitely spu…" — ytc_UgzUFvUno…
- "A term I learned a long time ago is GIGO. When the time comes, we will fill the …" — ytc_Ugw2jEc37…
Comment
I have a lot of respect for Hinton, and you do a good job at asking questions, but why not ask him about a third path: path one--humans and AI co-evolve; path two--AI eliminates humans; path three: AI gobbles AI-produced nonsense and gets worse and worse, but we can't evaluate how it comes to decisions and have already turned decision making and critical thinking over to the AI. John Oliver and many others have been focusing on this recently.
youtube · AI Governance · 2025-06-26T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxfWBHGFXjB2Z94ht54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwJbnuWhptat399tLZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzYrui-4eQIuhH9iXV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwGnYQtlZk_qlPcl5Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxT_TYeDGpExzsU6t14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxDf6m-9O_ii7bhHxt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy3eI72XegwdkZEjph4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyW0lBhmP4FLiDEvTt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwKyb4xhwRK0Clw6nR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyDMLM-y8UFNmyeUNt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
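The "look up by comment ID" step described above can be sketched in a few lines of Python: parse the raw response as a JSON array and index each record by its `id`, dropping any record that is missing one of the four coding dimensions shown in the result table. The function name `index_codings` and the two-record sample are illustrative, not part of the actual pipeline.

```python
import json

# Two records copied from the raw LLM response above, as a stand-in for
# the full array returned by the model.
RAW_RESPONSE = """
[
 {"id":"ytc_UgxfWBHGFXjB2Z94ht54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxT_TYeDGpExzsU6t14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
"""

# The four dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw response and index its records by comment ID,
    skipping any record that lacks one of the four dimensions."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if all(dim in r for dim in DIMENSIONS)
    }


codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgxT_TYeDGpExzsU6t14AaABAg"]["policy"])  # liability
```

Indexing once up front makes every subsequent ID lookup O(1), which matters when the same response is inspected repeatedly from the UI.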