Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I'm an artist. 99.9999999% of artists make little to no money off their art. of …" (ytc_Ugyr5Vr8V…)
- "That 7 OH is sold at Marijuana shops in vape and in gummy bears! It's still in t…" (ytc_Ugwstyi97…)
- "Yeah, like what happened with the Waymo automated vehicles in San Francisco in D…" (ytr_UgwvTlI20…)
- "as an artist, getting ai to turn smth into "art" and then calling yourself an ar…" (ytc_Ugy6s5wo9…)
- "I mostly see issue with people selling or posting ai pics. I'd love to get rid o…" (ytc_UgwCoQyJy…)
- "I'm fine with humans inspired by other humans because human learning is much muc…" (ytc_Ugxmyg2g2…)
- "so an explanation for this is ai is trained on data it cannot think for it's sel…" (ytr_Ugwp0Gmcb…)
- "there isn't much personal ethics in this society - it really gives no…" (ytc_UgwCVc9O5…, translated from French)
Comment
Since many have already spoken about McDaniel, I'd like to emphasize that the medical algorithm wasn't biased against Black people. The algorithm was initially trained on insurance claims data, and using that data it produced the "racially biased" result. When it was instead given biological data (not done at first, because collecting it takes more time), it properly identified patients in need of treatment, and no racial bias appeared.
The algorithm ignored skin color both times; the "racism" came from the data the algorithm was given.
youtube · AI Bias · 2023-11-02T10:1… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugylk6t_HLCoLlSZglN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzpoaEyFLu7JfnysNh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwi6o2ePzDr7jFCTSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxd-1QXNJsOoAaCINB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzccBSdtul2lZ8XzoF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzRlC9JLNb1xtopq4J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz5clJCCuohdvdVwap4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwSNZBOojztzGmcYHd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzY-sPMNqnk1M8BYKd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugzd9KtVGPMgAIJLzht4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
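A raw response like the one above can be parsed and checked before the codes are stored. Below is a minimal sketch in Python; the allowed value sets are inferred from this single sample and are an assumption, since the real codebook may contain additional values:

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption: the actual codebook may define more values).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM response and index valid rows by comment ID.

    Rows containing an unknown dimension value are skipped, so one
    malformed code does not discard the whole batch.
    """
    coded = {}
    for row in json.loads(raw):
        dims = {k: row.get(k) for k in ALLOWED}
        if all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[row["id"]] = dims
    return coded
```

Indexing by `id` makes it straightforward to look up the coding of any single comment afterwards, e.g. `coded["ytc_Ugz5clJCCuohdvdVwap4AaABAg"]["policy"]`.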