Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or browse random samples:

- "What's important is that it's not used on us...those outside. Hopefully laws can…" (ytc_UgxcuSdcZ…)
- "I think Google search needs regulations... Ask it a question about global warmin…" (ytc_UgykE_VPm…)
- "Similar to Medicare and Medicaid, maybe we can have both socialism and capitalis…" (ytc_Ugx__aS6P…)
- "To clarify about revealing blood type, I can maybe explain, even though I'm Kore…" (rdc_clv5d0c)
- "I mean it's kind of the same, people who don't care about games won't care about…" (ytr_UgwGs7jP4…)
- "Don't make AI in charge of much tbh simple work with short contrast and mental u…" (ytc_UgxEmaIKS…)
- "The fact someone on DeviantArt made over $11 K in a year… selling AI adopts……" (ytc_UgzePAni6…)
- "Okay Well im scared shitless i dont want to die because of AI. Im still pretty y…" (ytr_UgwoO9VFh…)
Comment
Having to craft a meta-prompt to get the AI to actually do what you want, which is to help you solve your problem, is frustrating, and you have to start organizing your prompt templates if you need it again. This kind of functionality of understanding of user intent and asking clarifying questions to figure it out should get built into the chat app somehow.
reddit · AI Harm Incident · 1751215088.0 (Unix timestamp: 2025-06-29 UTC) · ♥ 33
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_n0h7k28","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"rdc_n0oi65k","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"rdc_n0fg6sd","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"rdc_n0fjcd4","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"frustration"},
{"id":"rdc_n0fgsej","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
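
The raw response above is a JSON array in which each element pairs a comment ID with the four coded dimensions. A minimal sketch of how such a response could be parsed and validated before storing the codes; the `ALLOWED` value sets below are inferred only from the codes visible on this page, not from the tool's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dump; the real codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "frustration", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}},
    raising ValueError if any value falls outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        dims = {k: v for k, v in row.items() if k != "id"}
        for dim, value in dims.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded

# Two rows copied from the raw response above.
raw = """[
 {"id":"rdc_n0h7k28","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
 {"id":"rdc_n0fgsej","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]"""
coded = parse_coding_response(raw)
print(coded["rdc_n0h7k28"]["policy"])  # regulate
```

Validating against a fixed value set catches the most common failure mode of LLM coders: inventing a label (e.g. "anger" instead of "outrage") that would silently corrupt downstream tallies.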