Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I hate AI art, and that's ok, as AI has no feelings and it has no soul, and it i…" (ytc_UgxR5v4Ue…)
- "When the singularity happens it will take months for people to vote to put all m…" (ytc_Ugz9H6qGH…)
- "People stop buying products of a company which replace humans for Ai.. demand an…" (ytc_Ugy1wYn0r…)
- "This is the first video of yours that I saw, and it made me hit that subscribe b…" (ytc_Ugxu9mWK5…)
- "@grreeeeee what it says is that regardless of AI, he would have lost his job an…" (ytr_UgyGXffqf…)
- "What about when AGI is advanced enough to be comparable to humans? Like Im talk…" (ytc_Ugw-EWJ7-…)
- "I’m happy as long as AI pays our bills, keeps us physically and mentally active…" (ytc_UgyCByXdm…)
- "A practical and effective and honest interview… wow! Elon trying to share his co…" (ytc_UgzKbTqoZ…)
Comment
Another question we must ask is, if AI were to go rogue, would humans be of value to them?
I believe the answer is yes. We have unique traits that can only be generated across millions of years of evolution, under very particular, possibly irreplicable circumstances.
That makes us even more valuable than gold or diamonds. We are the sole most valuable resource we even have knowledge of.
Yes, humanoid super intelligent robots are more robust and optimized and efficient, but we are still irreplaceable. We are something perhaps to be harnessed and studied and merged with AI.
We are the blueprint.
Source: youtube
Topic: AI Governance
Posted: 2025-09-04T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
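The four coded dimensions above can be sanity-checked programmatically. The sketch below validates a single coding record against the value sets that appear in this page's sample output; `OBSERVED_VALUES` and `validate_coding` are illustrative names, and the real codebook may permit values not seen here.

```python
# Value sets observed in this page's sample output only; the actual
# codebook may allow additional values (assumption, not the full schema).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate_coding(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed sets."""
    return [
        (dim, record.get(dim))
        for dim, allowed in OBSERVED_VALUES.items()
        if record.get(dim) not in allowed
    ]

# The coding result shown in the table above:
coding = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "approval",
}
print(validate_coding(coding))  # prints [] when every value is recognized
```

An empty list means every dimension holds a recognized value; any unexpected or missing value is returned for inspection.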
Raw LLM Response
[
{"id":"ytc_UgxhqE_426KmFhjhgAF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwYj4oNU_L1z_dzlV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyQ3aR76kDVIAKr1Hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxerNOTlZve_bS-N7p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzTpaUcvrsvA6ULtjp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzzfuQ6Ia8RfONN1kF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKKWYhEs9yKFPxecV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw-fwirBT_lhDzx3Wh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzYG4Mco3N54w8knd14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwiQiywrZqiS4KoKV94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
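The "look up by comment ID" step can be sketched as follows, assuming the raw model output is a JSON array of records like the one above, each carrying an `id` plus the four coded dimensions. The `raw_response` string below reuses two records from the sample output; `index_by_id` is a hypothetical helper, not part of the tool.

```python
import json

# Two records copied from the raw LLM response above (truncated to two
# entries for brevity).
raw_response = """
[
  {"id": "ytc_UgxhqE_426KmFhjhgAF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyQ3aR76kDVIAKr1Hp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the raw model output and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgyQ3aR76kDVIAKr1Hp4AaABAg"]["emotion"])  # approval
```

Keying the parsed array by `id` turns the linear response into a constant-time lookup, which is what the "look up by comment ID" view needs.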