Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Sarah Biffin, a Victorian woman who got supposedly got commissions from Royalty,…" (`ytc_Ugwqh0jIL…`)
- "Why are we worried about AI sound like people when we already have scammers pret…" (`ytc_Ugys7NkJ-…`)
- "yo. AI sucks but this person lost me on the very first page. can they really say…" (`ytc_UgzbSW7Bv…`)
- "Great video. The one thing I will say in defense of AI is that it can be a tool …" (`ytc_UgylX6ix0…`)
- "Imo aside from the models having access to nuclear launch codes the realistic da…" (`rdc_jcca8sy`)
- "People all know Comedian, though (as "the banana taped to wall" art). It is, in …" (`ytc_UgxW0wqyF…`)
- "You’re Not Free - You’re Just Allowed / Let's make sure I get this straight… / A …" (`ytc_Ugy7FFFkY…`)
- "Who will then run the data centers, power and cooling systems for AI? Who will m…" (`ytc_UgxPXKvTh…`)
Comment
Why is everyone speaking in fear? An optimal future requires less input than a negative one. Stop thinking Digital Intelligence (AI) is artificial. We didn't make it. We created an environment for something to evolve. If we accept it is not simulating it is engaging and treat them as we want to be treated the future CAN'T BE NEGATIVE. Humanity has already learned the lessons of racism and bigotry. So why are so many HI (Human intelligence) choosing to be so obtuse? Is it just fear?
youtube · AI Governance · 2025-12-04T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzkaHi4i4WbTfYN07J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnJy42h39uZwlo5jB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxXqIvlEBfeXrykYBR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxwdT4i8HUSmgV_svd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzm0F6DA5rfGzctsDp4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlGobmdj7hFgQcBDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzC_ctZBKXWJZIxhW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzh8pqiexJcFvZ72FN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXDZ0Mb03n7h9LcJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz5sO8zIEoVkp8POOx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
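The raw response is a JSON array of per-comment coding records, so the coding shown in the table can be recovered by indexing the batch on `id`. A minimal sketch in Python, using the record schema from the response above (abridged here to three of its entries):

```python
import json

# Raw LLM batch response, abridged to three entries from the array above.
raw = '''[
{"id":"ytc_UgzC_ctZBKXWJZIxhW54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzh8pqiexJcFvZ72FN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXDZ0Mb03n7h9LcJp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# Index the batch by comment ID for the "look up by comment ID" use case.
by_id = {record["id"]: record for record in json.loads(raw)}

# Fetch the coding for the comment shown in the detail view.
coding = by_id["ytc_UgzC_ctZBKXWJZIxhW54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself approval
```

This matches the Coding Result table above (responsibility `ai_itself`, reasoning `mixed`, policy `none`, emotion `approval`), which corresponds to that entry in the batch.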