Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Claiming AI generated art as your own is like claiming a commission you bought … (ytc_Ugw1f6_dc…)
- 3:04 It's because they don't care about using technology, they want to never ha… (ytc_UgyPsaYNO…)
- Also the soul of art is in the details. Small, specific details can convey so mu… (ytc_UgwhXyUG0…)
- I was just about to call you out for AI slop until the last line.… (rdc_oi08e0n)
- Not these twats posting a video about AI taking job, made by AI and narrated by … (ytc_UgzBUeuPC…)
- Okay, I'll be straight - Ai has made my creative endeavors ridiculously easy, so… (ytc_UgzrzYeMJ…)
- It’s too bad we can’t use these for burned victims. Giving someone their face ba… (ytc_UgwJVT8PG…)
- company's will figure out if you replace everyone with ai that leaves no one to… (ytc_Ugyd_8xpV…)
Comment
Humans are so stupid. Even though it is statistically true that Waymo is way safer than human operated vehicles, humans see one story about a cat being run over by a Waymo and suddenly they want to ban them all even though 1000 of cats are run over by human drivers. We just have to accept machine errors the same way we do human errors. And yes it might look extremely troubling when a Waymo does something a human would never do, like drive in front of a policeman with their pistol drawn but the "Waymo" would probably think the same of a human that drove over someone in a blindspot.
Platform: youtube
Topic: AI Harm Incident
Posted: 2026-04-25T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugz_Do03kXRNZXy2Q2J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxrq0mfbmFLf5JaKy54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzU1ugdrbrMCOErFS14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwLGKaB5RTNWgBoDpJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQszlgreWqJyNnsv94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwL3dPqgV74b4P0Dj54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwfz9sshphchim_KfR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgycfADOLtTxQ4wkneF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugymq90hkqyuD0zCKNt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzQOjyaLnvj-ItC1QJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
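The raw response above is a JSON array of per-comment codings, one object per comment ID, with one value for each of the four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a response is below; note the allowed value sets are only those visible in this sample (the full codebook presumably defines more), and `parse_coding_response` is an illustrative helper name, not part of any real tool.

```python
import json

# Allowed values per dimension — only those observed in the sample response
# above; the actual codebook may permit additional values (assumption).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "fear", "indifference", "approval",
                "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    dropping rows with a missing ID or an out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row.get("id")
        if not comment_id:
            continue  # skip rows the model emitted without an ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_abc","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"outrage"}]')
print(parse_coding_response(raw)["ytc_abc"]["emotion"])  # outrage
```

Indexing by ID supports the "Look up by comment ID" view directly, and dropping out-of-vocabulary rows keeps a single malformed model output from corrupting the coded dataset.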