Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This isn’t that surprising. I’m not an expert on that type of technology, but I have a good understanding. Basically, the short story is that light reflects better off lighter surfaces than it does off black ones, and you’ll actually notice when you see or print small photos of black faces that they come out a lot darker and with fewer facial features than white faces. Because of how light works, it’s harder for technology to see them as distinct facial features. This video seems to be made as “they did this to be racist” — no, it’s just a lot harder to tell dark features apart than lighter features, and this includes human skin. To my knowledge, I don’t know if we have good technology to do facial recognition with most security cameras and things like that. Yes, with stuff like still photos it’s easier to get good lighting, but you can’t get perfect lighting or anything with store cameras (which seems to be the case in this situation, as he mentions retail fraud). Companies don’t have the money for 100 top-of-the-line cameras for every store, and even then it’s almost impossible to get proper lighting because of things like shadows (which really fucks with facial recognition systems on darker skin). Really, the only way we would be able to have a very accurate facial recognition system anytime soon would be for cashiers like myself to shine a bright light in every customer’s face just in case one of them decides to steal something. This isn’t a racism thing; it’s just that recognition on darker surfaces is a hard thing to perfect, and it’s not even close just yet.
youtube AI Harm Incident 2021-08-15T04:3… ♥ 7
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyaRaHkEYJxqJI6Xud4AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgxW5y2YdRIU7zf-TMl4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxWMtjc_CsEbKtAGX54AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgyPie-PTm9PFXyPtAR4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_Ugx5zaGU64xDd7Knvzh4AaABAg", "responsibility": "unclear",    "reasoning": "deontological",    "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgyfIDdrFKs5A4_xlvx4AaABAg", "responsibility": "company",    "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyBX-KbXO1i8asXJtF4AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgySug7kBtDcMUVfAzR4AaABAg", "responsibility": "unclear",    "reasoning": "unclear",          "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgxySvwLl8OqW0o89_t4AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "regulate",  "emotion": "indifference"},
  {"id": "ytc_UgwwD-q8OiVaDQlPDVh4AaABAg", "responsibility": "government", "reasoning": "deontological",    "policy": "regulate",  "emotion": "outrage"}
]
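The raw response is a JSON array of per-comment codes keyed by comment id, so matching a coded comment back to its entry is a simple lookup. A minimal sketch in Python — the `codes_by_id` helper is hypothetical, not part of the coding pipeline, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response above:

```python
import json

# Raw batch response as emitted by the model: a JSON array of
# per-comment code objects (truncated here to one entry for brevity).
raw = '''[
  {"id": "ytc_UgxWMtjc_CsEbKtAGX54AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

def codes_by_id(raw_response: str) -> dict:
    """Index the model's JSON array of codes by comment id (hypothetical helper)."""
    return {item["id"]: item for item in json.loads(raw_response)}

codes = codes_by_id(raw)
print(codes["ytc_UgxWMtjc_CsEbKtAGX54AaABAg"]["reasoning"])  # consequentialist
```

This lookup reproduces the Coding Result shown above for the displayed comment: its id maps to responsibility `none`, reasoning `consequentialist`, policy `none`, emotion `indifference`.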