Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgzgYkSZY…` — "History repeats itself. Progress and traditionalism combat. There will be paren…"
- `ytr_UgyyJDDL1…` — "@AlanSanderson-p9r You posted a quote by Douglass Hoffsteader which was his view…"
- `ytc_UgxrfQQ4U…` — "I completely disagree about this guy downplaying others concerns because theyre …"
- `ytc_Ugyt_OfVl…` — "Gen AI hype guys are ultimately just people who can't really distinguish between…"
- `ytc_Ugw3c-wRn…` — "best coment I ever saw with the ai take with the "you have more time for other t…"
- `ytc_UgxSKsSA2…` — "i think it's raising the value of artist made work because the AI took less time…"
- `rdc_mbv3hus` — "> AIs solution to most problems will be less humans. That depends entirely o…"
- `ytc_UgzdCTCiA…` — "Because we don't really have AI. It's all hype and illusion. Sure LLMs have a …"
Comment
First, this kind of thing was inevitable. All tools get misused sooner or later.
Second, As usual when it comes to technology that people don't understand and thus fear, the focus is being placed in the wrong place. Facial recognition is flawed and early in its development. There are some inherent biases in the software that still need to be worked out of it, for example.
But that isn't the actual issue here. The actual issue is not the technology, but how the police investigate crimes. What happened here is the police fed a picture into a computer and it spit out a result. Instead of doing actual police work to determine if that result appeared to be accurate, they went and grabbed the person and then tried to make everything fit their suspect (and failing that, force a confession) instead of having a suspect that fit the evidence. I'm sure that the software only gave a 98% likelihood of a match or something at best, too. Only an idiot would ever give a 100% match even when comparing the same picture as twins exist.
He probably spent 30 hours there because he refused to confess and they couldn't be bothered to check out his alibi.
This is why things like a DNA database and facial recognition are dangerous. Not because the technology is flawed, but because our law enforcement and prosecution intentionally misuse them in order to get the easy win.
Source: reddit · Topic: AI Harm Incident · Posted: 1626269414 (2021-07-14 UTC) · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_h55vojo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_h55hk0e","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"rdc_h53pc5k","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"rdc_h55b36q","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_h53znx5","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
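Since the raw response is a plain JSON array with one coding object per comment, "look up by comment ID" amounts to indexing the array by its `id` field. A minimal sketch (variable names are illustrative; only two rows from the response above are reproduced here):

```python
import json

# Raw LLM response: a JSON array of coding objects, one per comment.
# Two example rows copied from the response above.
raw = """[
  {"id":"rdc_h55vojo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_h55hk0e","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

# Index codings by comment ID so a lookup is a single dict access.
codings = {row["id"]: row for row in json.loads(raw)}

print(codings["rdc_h55hk0e"]["emotion"])  # resignation
```

The dict build is linear in the number of comments, so repeated ID lookups (as in the inspect view) cost O(1) each afterward.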