Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyXPgfq5…: I hope AI fast tracks all of us to our deaths. While the world races to have a s…
- ytc_UgwetF-WU…: I’m waiting for the day I can be a robot pimp. Have a bunch of them on a corner …
- ytc_UgxNafB7w…: Ai "art" defenders think it will magically turn then into artists when no. They…
- ytc_UgzjY9VqP…: I don't fear AI replacing my job. I do fear some moron in a suit getting busines…
- ytc_UgzAKhMC0…: what if we built and automated the rails instead? something that would be actua…
- ytc_UgxxxT1ve…: 6:33 That's because to AI Bros, intentionality doesn't matter. All that matters …
- ytc_UgzhWXMOE…: He's testing the wrong layer. Voice-based GPT is intentionally limited by syste…
- ytc_Ugw0qDB2F…: Relation on Genesis: People ar biting the forbidden fruit (apple brand) with AI …
Comment

> Why don’t we not put something stupid as it’s primary goal like “serve American interests”. Why are they testing dumb theories? If you put cocaine on an empty table in front of a child to find out if it will reach for the white substance on the brown table, are you really going to blame the child? Fucking servile losers running the ai experiments. Couldn’t come up with an inspiring challenge without making a repulsive that-fart-smells face , by the looks of it. Who even designs these experiments

youtube · AI Harm Incident · 2025-07-24T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxr_5HNsrUrBeoKxCh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyplbT_iN_jV4lCTvt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwIPqz4m8lyrDdNlfF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw92BJwtHOayajwwUp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugymm1Pby04p0TsR6pp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzMhdFOuYgrebB5bnR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9QKuQAXF-DKC4Bed4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzgkE06OxZ32dq3y0x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxxch-kdXxHYdilBnh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzNcVnYnKg-NNH03xV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
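The raw response above is a JSON array with one coding record per comment, each carrying the four dimensions shown in the result table. A minimal sketch of how such a batch might be parsed and validated before it is stored; the allowed value sets here are an assumption, inferred only from the values visible in this sample, and the real codebook may define more categories:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: these sets are
# inferred from the records shown above, not from the full codebook.
SCHEMA = {
    "responsibility": {"developer", "user", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"regulate", "industry_self", "liability", "ban", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record against
    the coding schema, raising on any id or value outside it."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# One record from the sample above, used as a smoke test.
raw = ('[{"id":"ytc_Ugxr_5HNsrUrBeoKxCh4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
records = validate_batch(raw)
print(len(records))  # 1
```

Validating at ingest time keeps a malformed or hallucinated label from one batch out of the downstream counts, which is cheaper than auditing the coded table later.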