Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Ai art invokes the same emotion as you would feel looking at cheap hotel art.…" — ytc_UgxwQeXpF…
- "Oh well, when you have the supposed father of ai saying that a proof of its awar…" — ytc_UgyYwn0-V…
- "How do you find out if your art has been used for AI? I wanna know if my art has…" — ytc_Ugxvk2d5u…
- "I totally agree with Sophia! It’s key to have that human touch in AI processes. …" — ytc_Ugw78IczR…
- "Bro, does that make you feel better?? Did you not understand anything he said af…" — ytr_Ugyd9XXEZ…
- "Is anyone surprised? They were designed by human narcissist and Huberus. Its cle…" — ytc_UgwVsjVoG…
- "Obama, pretending to be black Obama, is absolutely more believable than AI prete…" — ytc_UgwwcEP63…
- "Is there anyone talking about AI safety from a worldview perspective? I don't be…" — ytc_UgzywSsic…
Comment
It reminds me of Maslow's hierarchy — most people 'can't afford' to engage with AI risk because 'more immediate needs' dominate their attention. When I've raised concerns, people have told me to stop thinking about it because it's stressful and there's nothing I can do anyway. But 80% already disapprove, so awareness isn't really the bottleneck. The bottleneck is agency. People assume the problem is impossible when it's actually just hard, so they disengage rather than keeping their eyes open for opportunities to act. This is a form of rational ignorance — not just avoiding stressful information, but avoiding the ongoing cost of staying alert for moments where you could make a difference. With politics, that vigilance is recognized as civic duty. We need AI awareness to feel like that, not a fringe obsession.
youtube · AI Governance · 2026-03-15T05:3… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwv85jUqLdnvC1RcH54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwhAO9rQm0-5ljm34N4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz9Cjj0ZG2G7puDoLd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWtG54ym81l4csruN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyCG4lla29lY7MFQod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyxfZb5O9s6c5vHuV14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEfIt3A5BaJ1WUz154AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgwcE55bOBxHr95Vl6h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzSKdEjnC9PL1tdivh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWbKEFUbM-iEPjtcd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
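A response like the one above can be parsed into a lookup table keyed by comment ID, so that any coded comment can be inspected directly. The sketch below is a minimal, hypothetical helper, not part of the actual pipeline: the field names match the JSON shown, but the allowed value sets are assumptions inferred only from the values that appear in this response.

```python
import json

# Assumed vocabularies per coding dimension, inferred from the response
# above — the real pipeline's codebook may include more values.
ALLOWED = {
    "responsibility": {"none", "government", "company", "user",
                       "distributed", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "outrage", "indifference", "mixed",
                "resignation", "fear"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    objects) into {comment_id: codes}, skipping malformed entries."""
    out = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue  # drop entries with no comment ID
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[cid] = {dim: entry[dim] for dim in ALLOWED}
    return out

# Example lookup using one entry from the response above.
raw = ('[{"id":"ytc_UgzEfIt3A5BaJ1WUz154AaABAg",'
       '"responsibility":"distributed","reasoning":"mixed",'
       '"policy":"regulate","emotion":"resignation"}]')
codes = index_codes(raw)
print(codes["ytc_UgzEfIt3A5BaJ1WUz154AaABAg"]["policy"])  # → regulate
```

Validating against an explicit vocabulary matters here because LLM output can drift: an entry with a misspelled or invented code value is silently dropped rather than stored as if it were a valid code.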