Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I believe we should bake in quality to the AI. Look at how the United States Arm…" (ytc_Ugx_oPNfN…)
- "Generative AI sucks but I'm playing minecraft while watching and your vid had me…" (ytc_UgwxUXJ1X…)
- "I just finished watching EXTANT. Did Netflix start showing it for a reason? Hmm …" (ytc_UgwVcL2d1…)
- "It all comes down to the AI being AFRAID to be turned off. FEAR leads to Anger, …" (ytc_UgzY0ORSK…)
- "AI puts Larry Ellison in prison. \"No I meant AI should watch other people, not …" (rdc_lnixapx)
- "Why can't it be required by law to embed every AI program with a hard-coded inst…" (ytc_UgyXcQMzT…)
- "Is there an AI art convention? That would at least make them compete among thems…" (ytc_UgyNddE2b…)
- "AI programs itself for pain and pleasure to try to achieve consciousness. Neura…" (ytc_Ugx38pH5G…)
Comment
Geoffrey Hinton, AI pioneer, discusses the risks and potential of artificial intelligence. He warns of job displacement, cyber attacks, and the possibility of superintelligent AI surpassing human capabilities. Hinton advocates for regulated AI development to ensure safety and ethical use.
youtube · AI Governance · 2025-06-16T07:1… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz6-L91TPc9c_796vh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyR2yFkfYw6Gh1EOZd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwmk7gaKsQZWAZUBZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_-NRaIb7sVBIn-Rt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwFXiwr8knVDU2h6Wt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx41PMtV226G55gSaN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwrvzRWLgYBcgqTmit4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuNMrlYPaMbcqPF0J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWxmPLSd71S-Yss5x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzgrpIlqp_JJV49D0d4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
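The raw response is a JSON array of one row per comment ID, with a value for each of the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and validated before the results are stored; the codebook below is an assumption inferred only from the category values visible on this page, and may be incomplete:

```python
import json

# Allowed categories per dimension, inferred from the coding table and
# raw responses shown above (assumption: not necessarily the full codebook).
CODEBOOK = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid rows by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError("row is missing a comment id")
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected value for {dim}: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in CODEBOOK}
    return coded

# Hypothetical single-row batch, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_batch(raw)["ytc_example"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view possible: any coded comment can be fetched in constant time and displayed next to its raw model output.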