Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "It really makes no difference if AI will really feel or emulate fake feelings, b…" (ytr_UgwcGRjUJ…)
- "@Ulthane who knows,fella.. I'm not so sure about how AI gonna effect artists so …" (ytr_Ugw4IO0if…)
- "More hate. 😡 this man thought this robot to harm humans. I’m not sure how I feel…" (ytc_Ugz5HKmT_…)
- "Words can't possibly express in full detail and scale the sheer existential drea…" (ytc_UgwRXI6e8…)
- "I'm a software engineer and I've never wished for a technology to fail more than…" (ytc_Ugzr1n9ep…)
- "If their argument is that AI inevitable and will replace humans, this just means…" (ytc_UgwMurjGS…)
- "Man, that sounds just like the amazon hiring AI that they shut down because it c…" (ytc_UgxIYwHGX…)
- "People have a long and documented history with the machines they have come to re…" (ytc_UgwdCLxEr…)
Comment
@douglasboyle448 Think about the ramifications and consequences of what you are saying. The example above is a clear one: facial-recognition vendors have been called to testify before Congress, where it was made explicitly clear that the technology is racist, meaning it does not identify the same characteristics, with the same accuracy, when applied to people of color and especially Black people. And now you have police officers using it, when we all know many police are white supremacists, and the legal system is designed to convict people of color at much higher rates with harsher sentences. Those are facts, and when paired with technology that produces false positives like this case, you end up with a wrongfully convicted person or worse, another police execution of a person of color. Not to mention privacy violations. So no, there are far more negatives than positives to using this kind of technology, especially in the legal system.
youtube · AI Harm Incident · 2021-05-01T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxH5XqgIx9XLYEVwm94AaABAg.9Mm1ylLwa6a9Mm7qDzOMml","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxH5XqgIx9XLYEVwm94AaABAg.9Mm1ylLwa6a9MmSTi2CMkx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxH5XqgIx9XLYEVwm94AaABAg.9Mm1ylLwa6a9MnsmCNdYeN","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgxH5XqgIx9XLYEVwm94AaABAg.9Mm1ylLwa6a9MoFmW0gPYS","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyWJ14lM2UF4yYq9YV4AaABAg.9Mm-IaeHmjg9Mm6xpbtdnr","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyWJ14lM2UF4yYq9YV4AaABAg.9Mm-IaeHmjg9MmNQQ9ICKp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyWJ14lM2UF4yYq9YV4AaABAg.9Mm-IaeHmjg9MnhuBOTFEy","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytr_UgzcLWJQv5gJja0Ddkx4AaABAg.9MlvPORj07E9MoG-KGuih-","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgylsBnFt2vT91u6Dul4AaABAg.9Mlt7f8-QrN9Mm0BLBQpDV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwEScYzoiBJn3Bg5FF4AaABAg.AU0CCuIImdpAU5VWMUmIqp","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
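A raw response like the one above can be checked and indexed before its codes are trusted. The sketch below is a minimal, hypothetical example (the comment IDs and records are placeholders, not real data): it assumes the model returns a JSON array where every record carries exactly the five coding dimensions shown, validates that assumption, and builds a lookup table by comment ID, mirroring the per-comment inspection view.

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment codes.
# The IDs and values here are illustrative placeholders only.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_example2", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
"""

# The five dimensions every coded record is assumed to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

codes = json.loads(raw_response)

# Reject any record that is missing a coding dimension.
for record in codes:
    missing = EXPECTED_KEYS - record.keys()
    if missing:
        raise ValueError(f"record {record.get('id')} is missing {sorted(missing)}")

# Index by comment ID so a single comment's codes can be looked up directly.
by_id = {record["id"]: record for record in codes}
print(by_id["ytr_example1"]["policy"])  # regulate
```

In practice a validation step like this catches the most common LLM output failures (truncated arrays, dropped keys, extra prose around the JSON) before the codes are written to a results table.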