Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Will the advancement of AI make some people lose their jobs? Absolutely. Does th…" — ytc_UgwEaj_AM…
- "Nobody with training in fine arts would generate ai images instead of drawing th…" — ytc_Ugy73Jp8I…
- "By the end of the video they discuss about robots doing basic movments to ease f…" — ytc_UgwYAcsLr…
- "I don't think AI and robots will be able to work effectively in manufacturing. …" — ytc_Ugy5IYKPO…
- "AI is rlly fun, keep it in silly little cat vids but it doesn’t need to be makin…" — ytc_UgzRmmbEk…
- "the little butter robot at the start \"what is my purpose?\" me- \"to server butter…" — ytc_UgxxhV09O…
- "Is AI coming for graphics design jobs, tho? Not when you realize that the courts…" — ytc_UgxNUBWrI…
- "Beware, ai was trained very well on a specific group of ethnic people. generated…" — ytc_UgzKbBm5f…
Comment
It exerts social control.
If you know that a location you are planning to visit, or even the route to get to it, has facial recognition equipment that will put your face in a large database that can then be used to track you, you will be less likely to visit that location under certain circumstances.
A hypothetical: You are planning on attending a protest or demonstration of some sort. In order to get to the protest you have to drive or take public transportation to the meeting place, then you have to march or move someplace as a group, then you have to protest at your destination.
If each location along the way in this process has facial recognition, and you know that, and you still attend that means you are putting your face in a database that can then be used to track your movements, your habits, other social activities, who you socialize with, where you work, etc.
Depending on how powerful the person or entity you are protesting against is, giving them that much information could be dangerous. It therefore exerts a social control, because you could be dissuaded from exercising what is considered a constitutional right.
There's also the issue of false positives: people being incriminated in a crime just for having their face placed at a specific location at a specific time, regardless of their actual involvement. You may not think that's a serious issue, but we do kind of have a serious issue with law enforcement right now.
Then there's the issue of just storing people's faces and tracking data in a database that can be accessed by outside persons or the information being leaked somehow.
Source: reddit | Tag: AI Harm Incident | Posted (Unix time): 1563715115.0 | ♥ 22
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_euddy2g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_eudeyzp","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"rdc_eudetdn","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"rdc_eudf7y0","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"rdc_eudfq4v","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
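The look-up-by-comment-ID view can be reproduced from a raw response like the one above. A minimal Python sketch, assuming responses always arrive as a JSON array of objects each carrying an `id` field (the function name `index_codings` is illustrative, not part of any real tool):

```python
import json

# The raw model output shown above: a JSON array with one coding per comment.
raw_response = '''[
  {"id":"rdc_euddy2g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_eudeyzp","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"rdc_eudetdn","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"rdc_eudf7y0","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"rdc_eudfq4v","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and key each coding by its comment ID."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["rdc_eudeyzp"]["emotion"])  # fear
```

In a real pipeline the parse step would also need to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except` and log responses that fail to parse), since nothing guarantees the model returns valid JSON.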