Raw LLM Responses
Inspect the exact model output for any coded comment; records can be looked up by comment ID.
Random samples

- "Even if something is factually accurate we need to teach our algorithms to ignor…" (`ytc_Ugy0Pb31H…`)
- "Someone in my school this year deepfaked a photo of the principal naked. So disg…" (`ytc_UgyXuxkWk…`)
- "I wonder if this chucklehead realizes that if no one has a job because AI automa…" (`rdc_m27doxw`)
- "Ai is like when the characters break the 4th wall so much it no longer exists an…" (`ytc_UgxMiOICn…`)
- "I miss when AI art was more glitchy, full of defects and flaws. It was trippy an…" (`ytc_Ugwc4viyA…`)
- "pro tip for next time: the very act of using gen ai feeds it, even if you're not…" (`ytc_Ugx0w1Ik0…`)
- "There is no such thing as humans or ai's never producing any bugs. This will nev…" (`ytc_UgxFoxRhV…`)
- "This is pretty much the same as Skynet. The government and Microsot, OpenAI and…" (`ytc_Ugz1knuLb…`)
Comment

> Really? Why is Musk being targeted. As if kids can't find this porn through google, Microsoft, etc. Also TikTok, Chat, and other AI. And do you realize the sexually explicit videos on Netflix and Prime? Some of these say, "it's the parent's responsibility to monitor." Doesn't this apply to GROK as well? AI is so new and there are likely bugs to be worked out. I don't believe porn or explicit sexual images should be allowed through Netflix, Prime, or any other site where kids can EASILY get around their feeble safeguards, if they even have safeguards. Where were you all with these other situations?

Source: youtube · Topic: AI Governance · 2026-02-05T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugwym0_MFiI-IjubLip4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOkkVU5ZX3OMAs9FN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgaG2Hy87PtiNi4wV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxGxC8QeP9wEIG1iZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz_yKSW6F6H6uUnuNJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwD5UW-7HFcLrUGLqh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwL3MDK4Mif4yvDk2N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx-k9gMzss0areEthN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgywSXahWkB9kLftMEZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_NX4Qk5jM9p1kgfh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]