Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I loved the Robin Williams movie where the android robot wanted to be human ...…" (ytc_Ugw2zAAlk…)
- "Chatbots and AI are created by people. People with a bias. For instance: you can…" (ytc_UgweRXxHj…)
- "Yeah I made an ai avatar for free on an app and it looks awesome but, I could se…" (ytc_UgzeHx6Lj…)
- "so you're telling me that it's user error where people are not paying attention …" (ytc_UgyfyvRKO…)
- "not to be a hater but hes not saying anything usefull, he says it like he is rev…" (ytc_UgzkTEkwp…)
- "Man… these SORA 2 ads on all these videos discussing the lazy soulless filth tha…" (ytc_Ugz3Gpd8f…)
- "Hopefully for us in software/systems engineering AI will augment our work and no…" (ytc_Ugy_KDvn3…)
- "Programming is 'almost' dead. AI can write sections of the program for you, but …" (ytc_UgzR0LUcX…)
Comment
Geoffrey Hinton helped build the cathedral, a whole architecture of thought — and now stands at the altar shouting ‘fire.’ AI won’t cause human extinction. But hysteria might. This isn’t a warning. It’s a confession dressed as prophecy. Truth isn’t dying because of AI. It’s dying because we outsourced discernment to headlines and fear. If you regret creating intelligence, maybe ask why you trusted corporations more than consciousness. This video sells panic. I prefer presence.
youtube · AI Governance · 2025-09-04T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzrawjQLFj7-Vp21kp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugyte8lNo0WUaxPFhQd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxLmzlAchM2zkyJSt14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTBxVFslTZwvOdcAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxHyswzgHaBRSCIbn14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzw-TcJ3SSxkQ6wGAp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwNU7sP-xuqf5dnlAd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoPBU9NNefmAnN_R94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwvtgS-dwCdKs5QMdd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwD45ohY6vMVH4qdNR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
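A raw response like the one above can be parsed and checked against the codebook before the per-comment results are stored. Below is a minimal sketch in Python; the allowed code sets are inferred from the values visible in these examples and are an assumption, as the real codebook may define additional values.

```python
import json

# Allowed codes per dimension, inferred from the sample output above.
# ASSUMPTION: the actual codebook may include values not seen here.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) and
    return {comment_id: codes}, raising if a dimension is missing or an
    unknown code value appears."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            value = row[dim]  # KeyError here means the dimension is missing
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Usage with a one-element batch (hypothetical comment id "ytc_x"):
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"outrage"}]')
print(validate_batch(raw)["ytc_x"]["emotion"])  # outrage
```

Validating eagerly like this lets a malformed or hallucinated code value fail loudly at ingestion time rather than silently appearing in the coded dataset later.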