Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_Ugxovp7VB…: "hey guys look i used an AI calculator, i am a mathematician, no i don't know th…"
- ytr_UgzvQrknB…: "thats where your wrong. People ARE making me consume AI content. Whether its thr…"
- ytr_UgwVAdc69…: "@itakruI mean when I was “borrowing” art for the home game and a water mark was …"
- ytc_UgyaPo9Va…: "I hope ai is safe for people to use and they should banned all types of ai avail…"
- rdc_n7ss4ov: "Don’t just take my word for it, ask a professional. I’m just trying to help, wi…"
- ytc_Ugz5KH5k6…: "13:27 we are already told what WILL happen, it will demand worship. You will acc…"
- ytc_UgxJYNmNG…: "AI could never create… what a bad sentence. TODAY, MAYBE, AI is not able yet, bu…"
- ytc_UgzsPiQ-g…: "Actually in china they were doing a marathon with both humans and robots a robot…"
Comment
Every instance of a false arrest related to facial recognition software that I've heard of has been a Black person. This is at least the 2nd Black person falsely implicated by this technology and subsequently arrested in Detroit since 2020 when Robert Williams was erroneously thrown in jail. Mr. Williams has also filed suit, btw. And James Craig, Detroit Chief of Police at the time Mr. Williams was falsely arrested, admitted that the technology doesn't work.
“If we would use the software only [to identify subjects], we would not solve the case 95-97 percent of the time,” Craig said. “That’s if we relied totally on the software, which would be against our current policy … If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify.
Wtf, Detroit PD? Why are you still relying on this shyt that you know doesn't work? There are even federal studies that have shown that facial recognition systems misidentify Asian and Black people up to 100 times more often than white people. Why is, DPD, or any so-called law enforcement agency even allowed to use such technology when it obviously doesn’t work? Undoubtedly, if facial recognition had such a high error rate with white faces, then there would be far more than just a handful of municipalities that have banned it's use. Hell, there would probably be federal legislation to establish strict guidelines by now.
| Source | Topic | Posted |
|---|---|---|
| youtube | AI Harm Incident | 2023-08-08T18:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyZP9a94Ad0wCjUJB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWR8oQC80pLV-Qh3l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwzPNJTd_wN5s1rrdl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxPxYlJeicDnQ8_vtB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxtNWW_GZxiM9Bqq-l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxEaIDoNIMcd2f1rzF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwYa1-HAHTLpdMk4U54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzULOv3rWOdF20TfzZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzMGRuFdi8QGkRDJSp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugw1BLcit247NP8ta994AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
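The lookup-by-ID view described above can be sketched in a few lines: parse the raw batch response into a dictionary keyed by comment ID, then fetch one comment's codes. This is a minimal illustrative sketch, not the tool's actual implementation; the field names and the two sample entries are copied from the response shown, while the indexing helper itself is an assumption.

```python
import json

# Raw model output in the format shown above: a JSON array where each
# element codes one comment on four dimensions. These two entries are
# copied verbatim from the sample response.
raw_response = """[
  {"id": "ytc_UgxWR8oQC80pLV-Qh3l4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzMGRuFdi8QGkRDJSp4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "indifference"}
]"""

# Index the batch by comment ID so any coded comment can be inspected
# directly, mirroring the "inspect the exact model output" view.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

code = codes_by_id["ytc_UgxWR8oQC80pLV-Qh3l4AaABAg"]
print(code["responsibility"], code["policy"])  # company liability
```

Keying on the model-assigned `id` is what lets a truncated preview list (like the random samples above) resolve back to a full coding record.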