Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I feel like AI artist is such a stretch on being an artist. To be an Artist mean…" (`ytc_UgyZlPENs…`)
- "Actually it's 14,000. The same people that were pushing & welcoming AI are now t…" (`ytc_UgxSGAE-J…`)
- "@7up9downClips well congrats on spitting the biggest amount of bullshit lies any…" (`ytr_UgynNXa7G…`)
- "Local AI is a thing, though. How powerful or big those models are, I'm not sure,…" (`ytr_UgzbqpI6S…`)
- "The fact that clever AI Humanizer is free and still sounds this natural is actua…" (`ytc_UgxRd8BC9…`)
- "Thanks for the cool video. There is a big mistake in the early part about the co…" (`ytc_UgzeAoun7…`)
- "Make your kids read good books. If you don't have knowledge, you can't use AI we…" (`ytc_UgxxQqfRx…`)
- "This is just not even true. The latest Ai models with a node based structure li…" (`rdc_oh5xak7`)
Comment
As someone who's working on AI algorithms for his PhD work, when I see Hinton saying that he suddenly realized this or that after so many years working in the field seems to me more like a way of him saying he's recently seen something profound that caused a huge shift in his thoughts/expectations about the nature of AI systems and what they can do, and it seems that it scared him which might be an indication he's not telling the whole story, or more aptly put, the interesting/scary part about it... signed an NDA before leaving Google ?
youtube · AI Governance · 2023-05-12T07:5… · ♥ 342
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz9QSfJH8BO7_0HfrN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzdK9RowD-QjAL1h4V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzo2VP3HKOJNpOn3Ah4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw17CXthwiuZDpG2bR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx0mCUjC33Y9x0EHmJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw0LgMQoei9z38xeRh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzFdUDxUGJ79DAAUBN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwrLNu8DICIzuPlARJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzpH11BQ-HRDLIkxq54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwdI9cgvLSrBd8gbqx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
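The raw response is a JSON array of per-comment codings keyed by `id`. A minimal sketch (hypothetical code, not part of this tool's actual pipeline) of parsing such a response and indexing it for the by-ID lookup this page offers, using two entries taken from the array above:

```python
import json

# Two entries copied from the raw model response above.
raw_response = '''[
  {"id": "ytc_Ugz9QSfJH8BO7_0HfrN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwrLNu8DICIzuPlARJ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

# Index the codings by comment ID so one comment's result can be
# looked up directly, mirroring the "inspect by comment ID" view.
codings = {row["id"]: row for row in json.loads(raw_response)}

print(codings["ytc_UgwrLNu8DICIzuPlARJ4AaABAg"]["emotion"])  # outrage
```

In practice one would also validate that each entry carries all four coding dimensions before indexing, since a model response can omit or misspell keys.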