Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
The same people who tell you today "Tax us more, we will leave" are desperately …
ytc_UgwJO2C9c…
@patiopooper never said it wasn’t. All I did was say. An ai can definitely make …
ytr_UgzkuSQdH…
I couldn't decipher whether the first guy talking in this doc. was AI or real hu…
ytc_Ugy0Z5Q0H…
Only blue collar jobs will survive from Ai. Time to shift from desk to the field…
ytc_Ugz0cn3Fl…
Most people fear AI because they know subconsciously that humans created it, and…
ytc_UgzQIRS8o…
I like ai art
I have no artistic talent, the talent tools or dedication to learn…
ytc_UgzoG_v-Q…
My German view: the internet has only ever been beneficial for people sound of m…
ytc_Ugz-RRamO…
that part sent me, I thought maybe he said "mcdonalds" or something and the LLM …
rdc_n0lyweq
Comment
In the past, facial recognition has had trouble with certain races, either detecting them or telling them apart. It's one of the biggest reasons Apple Face ID had a huge problem in Asian markets. Look up "Apple Face ID China". It was such a huge problem. Remember, Apple marketed Face ID as more secure than a thumbprint because faces are more unique than thumbprints. I'm not going to debate whether that's true because I don't really care. But when the new iPhone with Face ID launched in China, Face ID could not tell people apart, so it ended up being insecure as a result.
HP had the reverse incident some time ago, where their all-in-ones with camera-tracking technology could not pick up dark-skinned people. There are YouTube videos where a white person is standing next to a black person but it only works for one.
Why is that? Well, just like many things historically, these items were only (or mostly) tested on white populations. Another example of that is laser removal, or anything that works based on pigments: when they tested them, it was on people with white skin. Tech has gotten better, no doubt, but that goes to show the history of how things can be racist inadvertently.
Now, "how is this racist" was your original question. If you haven't connected the dots up to this point, it's real simple. In the scenario of face recognition and, say, a population like China, Apple has shown us that computers in their current state aren't smart enough to tell the difference depending on the population. So if, say, the NYPD decided to use the same or similar technology Apple did and they needed to find someone who was of Chinese descent (immigrant/first generation), you have a real risk of failing to identify the correct person if we take what facial recognition says as the end-all-be-all truth.
At that point, you're being arrested for being Chinese, not because you "look" like the person who committed the crime. To a person who doesn't spend any time with a particular ethnic group outside of what
reddit
AI Surveillance
2020-09-13 17:26:58 UTC (Unix timestamp 1600018018)
♥ 13
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-25T08:33:43.502452 |
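A coding result like the one above could be represented as a small validated record type. A minimal sketch in Python, where the field names follow the table and the value vocabularies are inferred only from the codes visible on this page (the real codebook may define more categories):

```python
from dataclasses import dataclass
from datetime import datetime

# Vocabularies inferred from the coded samples shown here; assumptions,
# not the authoritative codebook.
RESPONSIBILITY = {"developer", "company", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"regulate", "none", "unclear"}
EMOTION = {"indifference", "outrage", "fear", "mixed"}

@dataclass
class CodingResult:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def __post_init__(self):
        # Reject any value outside its (assumed) vocabulary.
        for value, vocab in [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]:
            if value not in vocab:
                raise ValueError(f"unexpected code value: {value!r}")

# The record shown in the table above:
result = CodingResult(
    responsibility="developer",
    reasoning="consequentialist",
    policy="unclear",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-25T08:33:43.502452"),
)
```

Validating at construction time means a malformed code from the model fails loudly here rather than silently skewing later analysis.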
Raw LLM Response
[
{"id":"rdc_g54urul","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"rdc_g56d4ns","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"rdc_g54s3e6","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"rdc_g54zp12","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"rdc_g54uzaf","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
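A raw response like the one above is a JSON array of per-comment codes. A minimal sketch of turning it into a lookup keyed by comment ID (the field names come from the response itself; the strict error handling is an assumption about how malformed model output should be treated):

```python
import json

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a batch coding response into {comment_id: codes}."""
    items = json.loads(raw)
    if not isinstance(items, list):
        raise ValueError("expected a JSON array of coded comments")
    coded = {}
    for item in items:
        missing = REQUIRED_KEYS - item.keys()
        if missing:
            raise ValueError(f"missing keys {missing} in {item}")
        # Keep every coded dimension except the ID, which becomes the key.
        coded[item["id"]] = {k: item[k] for k in REQUIRED_KEYS - {"id"}}
    return coded

# Two entries from the raw response shown above:
raw = '''[
  {"id":"rdc_g54urul","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_g54uzaf","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]'''
codes = parse_raw_response(raw)
```

Failing fast on a missing key makes it easy to spot batches where the model dropped a dimension, instead of discovering the gap later during aggregation.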