Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You're forgetting the cost of the massive inconvenience. In the United States, at least, the facial recognition software appears to control the stoplights. So it holds the light until it either identifies you, or you obscure the camera's view of you, or it times out after something like two minutes.

The most important thing to know is that *the shit doesn't work*. So it holds you for an obnoxious amount of time at the light.

And it focuses on work trucks and vans, so business owners are paying for the lost time. Maybe only two minutes a day, but if there are two people in the van, and it's two minutes a day, 300 days a year, that's twenty hours of labor a year that's lost to security theater. *And it doesn't work*, and it can be defeated by lowering the sun visor, which is how we all noticed it.

It's been happening since 2007, so one business owner I know has lost literally hundreds of man-hours to this bullshit. And it *should* be illegal.
Source: reddit · AI Harm Incident · timestamp 1562204156.0 · score -4
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          outrage
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_esq4wj1", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_esrfaqq", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"},
  {"id": "rdc_esrs754", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_esqk7hk", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "rdc_esqhhj7", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
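The raw response is a JSON array of coded records, one per comment, each keyed by an `id` and carrying the four coding dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and a single record looked up, using the `rdc_esqk7hk` record from the response above (the one matching this comment's coding result):

```python
import json

# Raw LLM response, abbreviated to one record from the array above.
raw = '''[
  {"id": "rdc_esqk7hk", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "outrage"}
]'''

# Index the coded records by comment id for O(1) lookup.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Retrieve the coding for this comment.
coded = records["rdc_esqk7hk"]
print(coded["responsibility"], coded["policy"])  # company liability
```

In practice the full array would be parsed the same way, with each `id` mapping a coded record back to its source comment.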