Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgxYNVvTS…`: 0:27 Yeah, this broken English kind of reminds me that AI made folks lazier to e…
- `ytc_Ugzad2GVc…`: 51:23 God bless Neil for the Rosie the Robot memory, and the admonishment of laz…
- `ytc_UgzFyTsjv…`: I Will accept the art from an ai when they will go trought all the shit that a p…
- `ytc_Ugwux3HQw…`: As someone who uses ai generation frequently. I'm also all for artists finding w…
- `ytc_UgxJNf81-…`: If only the regulatory powers be would mandate an "light" or (emitter that emit…
- `ytc_UgxGHRNBj…`: If you upload all your medical history to a personal AI which only you have acce…
- `ytc_UgxFKytwz…`: Does this scenario take into account the companies that end up losing reputation…
- `ytc_UgzHusGj6…`: Their worried about AI killing us but humans are doing just fine in that categor…
Comment
Are you guys who are glazing Kara over the quality of this interview watching the same video as me? I've now seen Hinton twice say that LLMs "understand" and are not just autocomplete or stochastic parrots, which is utter nonsense, and she didn't even ask him what he meant by "understand," let alone call out the nonsense. Then they start talking about how chatbots are "alien beings." This is insipid.
Source: youtube · Topic: AI Governance · Posted: 2025-11-17T19:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw2AJNKtt2OgfjoEZZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyIpUU2aCX9jnFKErZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugznn0C1Fl_NHQR7md14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzxihzLBWVIGPcYWF14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJVVKbAm-A5anHK6V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrXac_HDq1J2t3OEl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyH-R_r0bUsdCOVRaF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJDkVz0OrSuu4-QuV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxyQl8wQqGyUeyitJt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwoXAkcvF-h0utWPwh4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
```
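A raw response like the one above can be consumed with a small parse-and-validate pass. The sketch below is a minimal illustration, not the tool's actual code: the allowed values per dimension are inferred only from the codes visible in this dump (the real codebook may include others), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the codes observed
# in this dump; the real codebook may differ (assumption).
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must be an object with an id and a legal value
        # for every coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical response: one valid record and one with an off-codebook emotion.
raw = '''[
{"id":"ytc_Ugw2AJNKtt2OgfjoEZZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_example_bad","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"joy"}
]'''
coded = parse_coding_response(raw)
print(len(coded))  # the second record's "joy" emotion is rejected
```

Validating against a fixed codebook catches the most common failure mode of LLM coders, namely inventing labels outside the taxonomy, before the codes reach the results table.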