Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm a software engineer at Flock Safety. I've seen a lot of discussion about our technology, and I wanted to offer an insider's perspective to correct some common misinformation. My goal isn't to "sell" you anything, but to provide facts so we can have a more informed conversation.

First, here's what our cameras DON'T do:
- They DO NOT have facial recognition. The AI is specifically designed to capture vehicle details, not people's faces or identities.
- They DO NOT track ordinary residents. The system isn't for monitoring who is coming and going from a neighborhood. It's a reactive tool used to search for specific vehicles after a crime has been reported.
- The data is NOT a free-for-all. Law enforcement agencies need a valid case number and reason to search the data. They can't just browse it out of curiosity.

So, what DO they do?
- The system is designed for one main purpose: to help law enforcement solve crime by providing objective evidence.
- It captures license plates and vehicle characteristics (e.g., make, color, type).
- It provides alerts on vehicles that are on official police hot lists (like stolen cars or vehicles linked to an Amber Alert).

It gives investigators a starting point. When a crime occurs (like a package theft or a break-in), police can search for a suspect vehicle description in that specific time and place. From the inside, I see the reports daily of cases being solved—from child abductions to recovering stolen cars—that would have otherwise gone cold. For police departments that are often understaffed, it's an incredibly effective tool.

I understand the "Big Brother" concerns. We've all been primed by movies to be wary of surveillance. However, the reality is that strict legal and ethical guardrails are built into the technology. It's about providing specific leads on criminal activity, not monitoring the public. The debate around this tech is important, but it should be based on how it actually works, not on fear or fiction.
TL;DR: I work at Flock. The cameras only capture car/license plate data to help police solve specific crimes. There is no facial recognition or general tracking of citizens. The system has strict rules and is designed to catch criminals, not monitor you.
youtube 2025-07-29T00:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxzVBGKhWmjPwPBbVd4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyoCg2IUlHpVaJOfTp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzDMhdGis9WKSxEk0V4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxrZ5LSh_y_d8WCcMV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgygMBLpzWp_OnYMf3N4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwSpqc7dztX0aD4smV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwhsTPniqOAaof5KNt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyX2O_GlhVOkwXbkvZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzDw8cpySBpyDKAo4J4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwrZt9sc_lnu5CGUwB4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
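The raw response is plain JSON, so the per-comment codings can be consumed directly. A minimal Python sketch (field names and ids taken from the array above; the `raw` string is abridged to two records for brevity, but the full array has the same shape):

```python
import json
from collections import Counter

# Abridged copy of the model's raw response; the real array contains
# ten records with identical keys.
raw = """[
  {"id": "ytc_UgxzVBGKhWmjPwPBbVd4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyoCg2IUlHpVaJOfTp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]"""

records = json.loads(raw)

# Index codings by comment id so a specific comment's coding
# (as shown in the table above) can be looked up directly.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_UgxzVBGKhWmjPwPBbVd4AaABAg"]["emotion"])  # indifference

# Tally any coded dimension across the batch.
emotions = Counter(r["emotion"] for r in records)
print(dict(emotions))  # {'indifference': 1, 'outrage': 1}
```

The same pattern extends to the other three dimensions (responsibility, reasoning, policy) by swapping the key passed to the tally.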