Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
Also, if you will allow the public in on the development of these AI machines, y…
ytc_UgzPE592U…
If someone is thrilled to have AI do their artist job for them like writing joke…
ytc_Ugys0ie_-…
adobe ai found it ok to send an ad on THIS video .... just warning…
ytc_UgwWKmp2J…
@BigInhalelarge corporations want to pay illegals half as much as Americans and …
ytr_UgxPxzApN…
The one time I was rude to an AI it actually put me on timeout for ten minutes a…
rdc_ktrcvmo
Don't allow Ai robots to build their own physical objects. Don't allow it in con…
ytc_Ugx0Pmm-B…
It seems like the simple solution is to replace 95% of CEO's with AI, duh.…
rdc_n9h7203
Well, that more decribes a nonsense task, rather than a real use-case for stable…
ytr_UgwPuWqOH…
Comment
The system is optimised for recall. Because it is intended for Human In The Loop pipeline, BECAUSE they just wanted to actually catch fucking criminals before they strike again. Not automatically, but buy reducing the amount of footage to the actually human-processible size. At some point, scientists should just fucking implement stuff, without caring about some dimwit button-pushers who have no understanding of the stuff they are writing about.
Like, "hey, buy analyzing a crowd of ~8000 people, the system flagged 42 suspects, 8 of which turned up to be actual wanted people. Who would never have been caught at this time without the system, because it is much easier to manually process 42 images and verify them agaist top5 suggested hits than 8000 against the whole fucking wanted database. But hey, lets complain anyway, because some hippies with brains of a snail feel eye-fucked buy robots.
reddit
AI Harm Incident
1562223895.0
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | utilitarian |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
{"id":"rdc_esq4wj1","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"rdc_esrfaqq","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"rdc_esrs754","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"rdc_esqk7hk","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"rdc_esqhhj7","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
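A raw response like the one above can be turned back into per-comment codes with a small validating parser. This is a minimal sketch, not the pipeline's actual code: the dimension names are taken from the dump, but the sets of allowed labels are assumptions inferred from the values visible on this page, and rows with unknown labels are simply dropped.

```python
import json

# Hypothetical sample in the same shape as the raw LLM response shown above.
raw = '''[
{"id":"rdc_esq4wj1","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"rdc_esrfaqq","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"}
]'''

# Allowed labels per dimension -- an assumption based only on values seen in this dump.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "utilitarian"},
    "policy": {"none", "industry_self", "regulate", "liability"},
    "emotion": {"outrage", "fear", "indifference"},
}

def parse_codes(text):
    """Parse a batch response into {comment_id: codes}, skipping invalid rows."""
    out = {}
    for row in json.loads(text):
        cid = row.get("id")
        codes = {dim: row.get(dim) for dim in ALLOWED}
        # Keep a row only if it has an id and every label is in the allowed set.
        if cid and all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            out[cid] = codes
    return out

codes = parse_codes(raw)
print(codes["rdc_esq4wj1"]["emotion"])  # → outrage
```

Validating against an explicit label set catches the most common failure mode of LLM coders (a label outside the codebook) before it reaches the results table.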