Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugyh0PFTe… — "i think people are focusing too much on self awareness. By morality experiments …"
- ytc_UgwOvblmn… — "I don't have any material up on DA but watch a number of artists there. The only…"
- ytc_UgwhoyQZn… — "Bro the way i always talk to all these Ai chatgpt, gemini,poe,pie is so horrendo…"
- ytr_UgzO3ySwn… — "You still need the patient to agree to it. We've all seen the demise of Automat…"
- ytc_Ugw5mjezX… — "This guy builds the thing and then warns us It's all gonna kill us and shit. Wha…"
- ytc_UgzxSmSUe… — "Art in every aspect...from music to film making to painting to photography...AI …"
- ytc_UgwDq752-… — "Too bad the phrase 'Final Solution' is a Hitler-related thing because it's obvio…"
- ytc_UgxMiY0D9… — "Isn't this yet another example of the man-made towers of Babel and golden calfs …"
Comment
With deep respect, I find Mr Hinton's understanding of modern lifestyles and technologies to be somewhat lacking, Godfather of AI or not. As an example: for years now, the problem of AI call agents having their time wasted has been solved by the bots simply being programmed to answer specific questions only and give polite refusals outside of those options, very easy... Also, his 'what if' scenarios are not realistic or practical, nor is his understanding of corporate processes and modern commercial enterprises... I feel that he got left behind by the technology at some point and is now stuck in his outdated understanding of the dangers. It's like a 1920s engineer running around a modern ship warning everyone of the danger of icebergs, but we solved that problem many years ago, after his time working on ships.
youtube · AI Governance · 2026-04-17T14:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxTo6_1kciFLCR1fxJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgynXPbsdIL_RgNAEDx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQ4l3x_MDCyMu13bB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzVwhgwJIwMoJ5-mBV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz7piCZ54U99HIHlEd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxgfiUP-8YLx4OpgoB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzuwVevs0PBaZmG8U94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwmm7UqMHPVZwbWaFB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx8qjN7JOfYGjuFc6J4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyYIWW4pMIxHEFfIV94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
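The raw response above is a JSON array of per-comment records, each coded on the four dimensions shown in the result table. A minimal sketch of how such a response might be parsed and checked against the coding scheme — note the allowed values below are inferred only from the table and the sample records above, not from any official codebook, and the function name `validate_coding` is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the examples above
# (assumption: the real codebook may include values not seen here).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-schema values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

sample = ('[{"id":"ytc_x","responsibility":"developer",'
          '"reasoning":"consequentialist","policy":"none",'
          '"emotion":"indifference"}]')
print(len(validate_coding(sample)))  # → 1
```

Rejecting malformed records at ingestion (rather than at analysis time) keeps the coded dataset clean even when the LLM occasionally emits a value outside the scheme.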