Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Where will freedom and liberty for humans and who will have the power control wh…" (ytc_Ugz-loyX-…)
- "I want to say, Human is danger than everything, becox everything is controlled b…" (ytc_UgxCEu2we…)
- "Let's not watch it. If these AI shows trended then animators will never get the …" (ytc_Ugz4h_otX…)
- "Maybe not many users are committed to listen entire podcast. Just so in a force …" (ytc_UgyNQfmnJ…)
- "@solimm4sks510 Tbh, I think the issue is that Glaze/Nightshade is not perfect, A…" (ytr_UgyxFXMm-…)
- "My worry is when we start applying human rights to the AI models. Just like we g…" (ytc_UgzYz982A…)
- "I used ChatGPT to answer some Trig and Calc questions. It's so FUCKING TRASH 😂…" (ytr_UgxA19AC5…)
- "Is it possible that AI will intentionally deceive us in the future or censor tho…" (ytc_Ugww5L6Lh…)
Comment
> The truth is you can't stop AI and its progress, to many companies/corporations have dollar signs in their eyes. We hope it becomes used as only tools, but if history has taught us anything. Humans take something used for good and use it for bad. That's just how humans are, it's human nature sadly. I don't fear it expect for one thing, putting AI in charge of systems that control weapons on a grand scale. People with power/money think they can control things they create or mange, sadly history has shown that doesn't work, so with AI how would that be any different when AI is smarter than 100 people now, 1 million people in a couple years. That's some scary 💩
youtube · AI Governance · 2025-10-02T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
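A coding like the one in the table above can be checked against the dimension vocabularies that appear in this view. A minimal sketch follows; the allowed value sets are an assumption inferred from the codings visible on this page, not a confirmed schema, and the real vocabulary may be larger.

```python
# Allowed values per dimension, inferred from the codings shown on this
# page (an assumption, not a confirmed schema).
ALLOWED = {
    "responsibility": {"none", "distributed", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "ban", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

# The coding from the table above.
coded = {
    "responsibility": "distributed",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "fear",
}
print(validate(coded))  # → []
```

Running `validate` on an empty record reports one problem per missing dimension, which makes it easy to spot partially coded rows.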
Raw LLM Response
```json
[
{"id":"ytc_UgyBYvIpaON4sSB0p5h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgybX7CpubnvveZOgtF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx1S_tcWXBqbTlV1Wp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzHUvKfrSO20ro5iEJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwBN1vQ2ntWD0DctUh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx5LHkot8VhU1Kav7N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4Vu6FcD6ICOBn-vt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwnlUlFvIPhY7PxjDl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzEJJ_tlRnwzEAxZOB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyVpJ-L19dy8c5mGZN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
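The raw response above is a JSON array with one record per comment, so the "look up by comment ID" feature amounts to parsing the array and indexing it by `id`. A minimal sketch, assuming the response parses as valid JSON (a real batch may occasionally fail to parse and would need error handling); `raw` holds two records copied from the response above for brevity.

```python
import json

# Two records copied from the raw LLM response above, for brevity.
raw = """
[
  {"id": "ytc_Ugx1S_tcWXBqbTlV1Wp4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzHUvKfrSO20ro5iEJ4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Parse the array and index it by comment ID for direct lookup.
codings = {rec["id"]: rec for rec in json.loads(raw)}

print(codings["ytc_Ugx1S_tcWXBqbTlV1Wp4AaABAg"]["emotion"])  # → fear
```

In practice `json.loads` should be wrapped in a `try`/`except json.JSONDecodeError`, since the model output is not guaranteed to be well-formed JSON.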