Raw LLM Responses
Inspect the exact model output for any coded comment: look it up directly by comment ID, or browse the random samples below.

Random samples:
- `rdc_kyzfazz`: Personally, I know plenty of people that tell me they might not pursue a CS-rela…
- `ytc_UgwhDOgCZ…`: Why hire a free man when you can get a slave, why hire a slave when you can buy …
- `ytc_UgyPX5Dm6…`: If he was taking AI health advice, at least half of his braincells were cooked a…
- `ytc_UgyX8XLn8…`: The point is, people don't people believe in themselves. They don't think ahy am…
- `ytc_Ugz0zuami…`: Grok ai shows that ai images generative are stupid bad. We got people making fet…
- `ytc_Ugwr2Xtrb…`: It’s a robot (yes this is late) but they had these at the chargers vs dolphins g…
- `ytr_Ugx7LCYBJ…`: No bro . I'm a java backend developer. Things are changing rapidly. Our sr. As…
- `ytc_Ugy7ktRpW…`: Employers are trying to make their employees train the AI so everyone can be fir…
Comment
This is a terrible idea. Until we've gotten a couple hundred thousand more hours of data on self-driving cars--especially in concert with other self driving cars, the driver should definitely know what the fuck they're doing if something fails.
Platform: reddit
Topic: AI Harm Incident
Timestamp: 1475388076.0 (Unix epoch, i.e. 2016-10-02 UTC)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
{"id":"rdc_d8ai0nx","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"rdc_d8almjm","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"rdc_d8b7vpz","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"rdc_d8ar0o0","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"rdc_d8azx7v","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
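The look-up-by-comment-ID view above implies parsing this raw batch response into a per-comment index. A minimal sketch in Python, assuming the response is exactly the JSON array format shown (the function name `index_by_comment_id` and the two sample rows are illustrative, not part of the tool):

```python
import json

# A raw LLM batch response in the format shown above (two sample rows
# copied from the response; any subset works the same way).
raw_response = """
[
{"id":"rdc_d8ai0nx","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"rdc_d8almjm","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each coding by its comment ID."""
    rows = json.loads(raw)
    # Drop the "id" field from each row; it becomes the dictionary key.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_comment_id(raw_response)
print(codings["rdc_d8ai0nx"]["policy"])  # -> regulate
```

Keying on `id` makes the per-comment detail view (dimension table above) a single dictionary lookup rather than a scan of the full response.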