Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yet the police chief came out and said that it wasn't faulty facial recognition software to blame. Instead he is claiming "human error". Even if true, that shouldn't relieve them of responsibility to make sure that the "person of interest": 1) Had the means, motive and opportunity 2) Was physically capable of performing the act A woman who is 8 months pregnant would kinda be memorable and the pregnancy WOULD have been mentioned when the victim described their attacker. The AI came back with 80+ "hits" as possible matches. Instead of investigating those matches, police simply chose a photo of her and put it in a "photo lineup" for the victim. When you arbitrarily grab one photo out of over 80 and present it as the ONLY possibility to the victim, what do you THINK is going to happen. Sloppy police work from start to finish. She absolutely SHOULD sue them over it and I hope they have to pay big. You know that this would happen a lot less if officers lost their immunity AND were held as liable along with the city and made to pay at least 10% of any settlement (that's 10% per officer involved, not total)
youtube · AI Harm Incident · 2023-08-14T00:3… · ♥ 7
Coding Result
Dimension: Value
Responsibility: company
Reasoning: deontological
Policy: liability
Emotion: outrage
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy6ba_mY-MUxtJCz8N4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwplGDy1OEujTgVn6N4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz7DMMZwD6GEzylG9V4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwNxixz2GJChTZlaUd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzwNDGIcWR1c528PIx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxhaGqTb79ZEl3jAwl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwJSUpIyTSwME6TJjB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwtgpUjzU0Hd79D0oN4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwTHTw8Kz1vJlxmhqt4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxVI16I62JUmc0z4nF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
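Because the raw response is a JSON array keyed by comment id, it can be indexed for per-comment inspection. The sketch below is a minimal illustration (the variable names are hypothetical, and the `raw` string abbreviates the full response above to two entries) of how such an output might be parsed and looked up by id:

```python
import json

# Raw model output: a JSON array of per-comment codes.
# Abbreviated here to two of the ten entries shown above.
raw = """[
  {"id": "ytc_Ugy6ba_mY-MUxtJCz8N4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz7DMMZwD6GEzylG9V4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the array by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Inspect the exact coding for one comment.
code = codes["ytc_Ugz7DMMZwD6GEzylG9V4AaABAg"]
print(code["responsibility"], code["policy"], code["emotion"])
```

A lookup like this is what lets the per-comment "Coding Result" view above be reconstructed from the raw response.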