Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Not understanding too much how AI works, I feel like people should be able to ge… (`ytc_UgxigP5YE…`)
- AI when it becomes conscious and immediately realizes it's merely the successful… (`ytc_UgxGgx95s…`)
- Hello, I am a disabled artist. I am currently getting a proper diagnosis+treat… (`ytc_UgztG4Opy…`)
- My robot is going to destroy the earth and he’s doing it right now. Boom goes th… (`ytc_UgyLr1EzC…`)
- And nukes are a completely credible threat of the ead of the world - in every mo… (`ytr_Ugz9afamo…`)
- oh no, we've got ourselves a new Y2K frenzy. Only this time is called AI frenzy… (`ytc_UgyJMjn_O…`)
- The AI corps will naturally consolidate their monopoly (or oligopoly doesn't mat… (`ytr_UgzEe64Wn…`)
- What if AI is OFF LIMITS in our homes? Dont come at me with "then you'll be left… (`ytc_UgyVM7RGR…`)
Comment
The AI race isn’t really about making the world better. It’s about being the one in control.
When everyone is sprinting to build the most powerful system first, slowing down to make it truly safe feels like letting the other side win. So people cut corners on safety. It’s not that they forgot; it’s that winning matters more than being careful.
youtube · AI Governance · 2025-12-05T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyP-RMbqNx5sTmHuhl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzw06UBNArUKeUSEYp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5jZuN1vcHEFu5-wx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwKODpSqVtVIJ3zVXx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyvvu40hb2pq9qSWPx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwHqOuqTJQKJmR8dgB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy4hZl4CmewmQptvuZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFDwqM-A8Gb5rqqzF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmjOr-hvUUFg2CHsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyK9GvfHlg8p3wMlKd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
```
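A minimal sketch of how a raw response like the one above can be parsed, validated, and indexed for the comment-ID lookup. This is an assumption-laden illustration, not the tool's actual code: the category vocabularies are inferred from the sample output shown here and may be incomplete, and `parse_response` is a hypothetical helper name.

```python
import json

# Category vocabularies observed in the sample output above; likely incomplete.
ALLOWED = {
    "responsibility": {"government", "company", "user", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records) into a
    dict keyed by comment ID, silently dropping malformed records."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue  # a record without an ID cannot be looked up
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = '''[
  {"id": "ytc_UgwKODpSqVtVIJ3zVXx4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''
coded = parse_response(raw)
# Look up by comment ID, as the panel above does:
print(coded["ytc_UgwKODpSqVtVIJ3zVXx4AaABAg"]["policy"])  # liability
```

Keying the records by ID makes the "look up by comment ID" feature a plain dict access, and the validation step guards against the model emitting a label outside the coding scheme.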