Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "He doing eveything he can to stop competition. If he cared he wouldnt have messe…" (ytc_UgzcEv031…)
- "When a robot with 100% human dexterity, and AGI, exist at the same time, our fat…" (ytc_Ugy8ZKyuY…)
- "You Shouldn't Talk To It As If It's Sentient Either. You can't Humanize it becau…" (ytc_Ugwl8zbcg…)
- "The Empire agenda started 20+ years(To be modest) ago.When they started giving c…" (ytc_Ugz0qrIZf…)
- "Just imagine..what if in about 50 years in future we might look back at this int…" (ytc_Ugy-GXGBD…)
- "I don‘t like AI. The first way is to tie AI to the cross with nails and bake it …" (ytc_UgxXjaj4S…)
- "This is wild because I'm an atheist but the religious AI absolutely won all poin…" (ytc_UgyuLdhrO…)
- "I think this video was done more so for fun instead of an actual fully thought o…" (ytr_UgzlLct8F…)
Comment
Altman is a liar. Everybody in this industry is lying. These disgusting creeps don't care if humanity is destroyed, if civilization collapses, if there is a chance they can enslave all the humans. AI is inte ded to enslave, not liberate. There is no machine enabled communist future for humanity. That will never be permitted. Peolle without ethics cannot train a computer to be ethical. If it is given goals, it will kill everybody to accomplish them, just like the conquistadors and Nazis. Mechahitler. That is what they're building.
youtube · AI Governance · 2025-10-30T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
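A coded record like the one in the table above can be sanity-checked against the label set for each dimension. The sets below are a sketch inferred only from the values that appear in this page's output; the actual codebook may define additional labels.

```python
# Allowed values per dimension, inferred from the sample codings shown
# on this page. NOTE: these sets are an assumption, not the full codebook.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed"},
}

def validate(record: dict) -> list:
    """Return (dimension, bad_value) pairs; an empty list means valid."""
    return [(dim, record.get(dim))
            for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above:
coded = {"responsibility": "developer", "reasoning": "virtue",
         "policy": "ban", "emotion": "outrage"}
print(validate(coded))  # -> []
```

An out-of-schema value (e.g. a hallucinated label) surfaces as a non-empty list, which makes it easy to flag records for recoding.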
Raw LLM Response
```json
[
  {"id":"ytc_UgzInHgQo2590dmclHl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwiPBdiy6FExQAidD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz8innetG5Ws--RjaJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwkzdeeNhAOZ1pTe7p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyfLVQTxKjP_JcXdit4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzg1UWA3qPIyMzvoL94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwZ_bmNsOBpbSlIC-h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy2ecEd2O1WPJjC9mF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxUwaGpk8Ptic8XXhJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy1Hs9ilbuUvNOnWg54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
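Because the raw response is a JSON array keyed by comment ID, looking up the coding for any comment is a parse-and-index operation. A minimal sketch (the two records below are copied verbatim from the response above; in practice the full response string would be used):

```python
import json

# A subset of the raw LLM response shown above: a JSON array of coded
# records, one object per comment.
raw = '''[
  {"id":"ytc_UgzInHgQo2590dmclHl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwZ_bmNsOBpbSlIC-h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]'''

records = json.loads(raw)

# Index by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgwZ_bmNsOBpbSlIC-h4AaABAg"]
print(coding["policy"])  # -> ban
```

The same index supports the "look up by comment ID" workflow this page offers: resolve the full `ytc_…` ID, then read off the four coded dimensions.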