Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That second one about McDaniels doesn’t talk about how that AI worked. It’s been mathematically proven that if you know someone who has shot someone or has been shot, you are more likely to shoot someone or get shot. McDaniels had known five people who had shot or was shot, and that made him the most likely person in that Police Station’s jurisdiction to either shoot someone or get shot. The AI doesn’t take into consideration the race of the person, the only thing it did was analyze if someone knew someone, which I think is creepier. VSauce 2 did a much better job explaining this than I did, I believe the video was called “why mathematicians don’t help cops”.

Edit: This one is gonna get this comment deleted, but number three is due to the heinous practice of slavery from years back, people of color were genuinely bred to be stronger and better. People of color have stronger immune systems on a genetic basis than those who are white. Not to mention all the centuries of white diseases that only those with the strongest immune systems would’ve survived being dumped all at once. It was a sick and disgusting practice that killed countless numbers of people, and left only the strongest. Number one though is something that needs serious work to be done because that’s really bad shit right there.
youtube AI Bias 2022-12-17T13:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgweSMLxy0SruiYPCbB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPK7keRcGusxqtKaN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwllGdQLziQMBA9Nkt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyyXTVH8PKKJQoSWWt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwh3X1PGWrhSFR-9VJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzzLpl7VKY4CL5HfPF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxpYPEl-34E1d_JI394AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyIfLh2zBJ3mPWQn4l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxLjsvPYJl36bZBqsB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw3kKe7bQS4rmFfPDN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
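A raw response like the one above can be parsed and sanity-checked before it is stored as a coding result. The sketch below is an assumption about how such a check might look: the allowed value sets are inferred only from the values that appear in this dump (the real codebook may contain more categories), and `parse_raw_response` is a hypothetical helper, not part of any named pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the values seen in
# this dump (an assumption -- the actual codebook may differ).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "user", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "resignation", "unclear"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse the raw LLM JSON array and index records by comment id,
    keeping only records whose values fall inside the codebook."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

# Example: the record for the comment shown above.
raw = ('[{"id":"ytc_UgyyXTVH8PKKJQoSWWt4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgyyXTVH8PKKJQoSWWt4AaABAg"]["responsibility"])  # -> ai_itself
```

Validating against a fixed value set catches the common failure mode where the model invents an out-of-codebook label; such records are simply dropped rather than stored.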