Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "And just remember military is always like 25 years ahead of the general public w…" (ytc_UgwvvIy5x…)
- "Lol, I really laugh on these people uhhh AI is going to take over. No is not goi…" (ytc_UgxIE_pge…)
- "Happened to me. I was making chatgpt to make layouts of 21 chapters. At the 14th…" (ytc_UgywZ_Sgj…)
- "By the end of 2026 you'll be talking to an AI and you will not know because it s…" (ytc_Ugz66iSyH…)
- "I’d have a lot more respect for anti-AI people if they simply said, en masse, ‘I…" (ytr_UgwwPXcHw…)
- "If you pause the "AI" program and reduce it to an array of bits, no consciousnes…" (ytc_UgwKYhewN…)
- "Yes, thank you for not being one of those AGI believers. You had me hooked when …" (ytc_UgzKv5C6b…)
- "Me. I'm writing my manga with the help of ChatGPT because it's very straightforw…" (ytc_UgwyQY9AV…)
Comment
Admittedly not familiar with the specific example, but yes, this is possible with input bias. The programmers feed a bunch of data into the program for analysis, but it's unrealistic to feed in absolutely everything. So there are a few potential sources of bias:
1. The data fed in may include an unrepresentative sample of crimes.
2. African Americans are historically more likely to receive lengthier sentences, including parole. With longer parole periods, there's more time for individuals to reoffend or violate parole.
3. African Americans are more likely to be charged with more serious crimes or to be offered less favorable plea deals. The program may therefore see a higher rate of felonies versus plea-bargained misdemeanors.
4. The input data may fail to include other distinguishing characteristics, such as socioeconomic status, local unemployment rates, or the family, government, or NGO support available for reintegration. If African Americans fare worse in these other areas upon release, but those factors are not included in the data, then the software will attribute the correlation to race when the actual correlation is with these other factors.
In general, when you plug bad data into a program, you get bad data out.
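The omitted-variable mechanism in point 4 can be illustrated with a toy simulation (every group name, support level, and rate below is an invented assumption for illustration, not drawn from any real dataset): reoffense here is driven entirely by post-release support, yet a model that never sees the support variable would observe a group-level gap and attribute it to group membership.

```python
import random

random.seed(0)

def simulate(n=10_000):
    """Generate (group, has_support, reoffends) rows; group has NO causal effect."""
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        # Hypothetical assumption: group B receives less reintegration support.
        has_support = random.random() < (0.7 if group == "A" else 0.4)
        # Reoffense probability depends only on support, never on group.
        reoffends = random.random() < (0.2 if has_support else 0.5)
        rows.append((group, has_support, reoffends))
    return rows

rows = simulate()

def rate(rows, pred):
    """Reoffense rate among rows matching the predicate."""
    sel = [r for r in rows if pred(r)]
    return sum(r[2] for r in sel) / len(sel)

# Marginal rates differ by group even though group has no causal effect...
print("A overall:", round(rate(rows, lambda r: r[0] == "A"), 3))
print("B overall:", round(rate(rows, lambda r: r[0] == "B"), 3))
# ...but conditioning on the omitted support variable removes the gap.
print("A | support:", round(rate(rows, lambda r: r[0] == "A" and r[1]), 3))
print("B | support:", round(rate(rows, lambda r: r[0] == "B" and r[1]), 3))
```

If the support column is left out of the training data, any model fit to these rows can only express the gap through the group column, which is exactly the spurious correlation the comment describes.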
reddit · Cross-Cultural · posted 2018-10-10 (Unix timestamp 1539187581) · ♥ 66
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
{"id":"rdc_e7jkpus","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"rdc_e7j1brn","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"rdc_e7ipl28","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"rdc_e7ipybi","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"rdc_e7j1qhk","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"}
]
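A pipeline consuming this raw response has to parse the JSON batch and sanity-check each record before merging it back into the comment table by ID. A minimal sketch, assuming the allowed code values are exactly those visible on this page (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the codes shown above (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "unclear"},
    "emotion": {"outrage", "unclear"},
}

# Two records copied from the raw response above.
raw = '''[
 {"id":"rdc_e7jkpus","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
 {"id":"rdc_e7j1brn","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

def parse_codings(text):
    """Parse a batch response and index it by comment ID, rejecting unknown codes."""
    coded = {}
    for rec in json.loads(text):
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

coded = parse_codings(raw)
print(coded["rdc_e7jkpus"]["responsibility"])  # developer
```

Validating against a fixed vocabulary catches the most common failure mode of batch coding, where the model invents an off-codebook label for one record and silently corrupts the downstream counts.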