Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You have a point to some degree — any technology can be misused — but your spin into full-blown paranoia isn’t supported by reality. We all carry devices in our pockets that track our GPS in real time. We take photos that embed exact location and time stamps. We install social media apps that demand location data and photo access just to function. And we line up to buy the latest gadgets that could, in theory, spy on us.

I’m no fan of Flock. I’ve seen it in action, and it’s a huge waste of taxpayer money — the ROI just isn’t there. In reality, Flock is a network of cameras tied to a database of stolen plates and plates linked to serious crimes like murder, theft, child exploitation, or top-ten warrants. Accuracy? Around 65–75% on a good day. The “track you by bumper sticker” idea is pure movie magic. The cameras aren’t that good, and the system doesn’t have the computing horsepower to process that kind of detail even if they were. If a plate is captured, an alert goes to police — who may or may not have the time or resources to respond.

Here’s how it works in the real world: a stolen car is picked up by a Flock camera at an intersection. If someone is actually monitoring the system and an officer happens to be nearby, they might have a chance to intercept. But as Louis pointed out, the effective capture rate per camera is microscopic. Denver could install its own LPR cameras for a fraction of the cost and run its own real-time database with better results.

Identifying a car by scratches or dents? That wouldn’t survive five minutes in court. License plates are the only reliable identifier here. The “AI” in Flock is mostly marketing hype — the units run on solar panels and cellular connections with limited processing power. Truly advanced video analytics, like BriefCam’s video synopsis, require massive computing resources. Think $30k servers just to monitor ten cameras in real time, plus careful programming for each search. The more you ask it to analyze, the slower it gets — unless you spend heavily on hardware.

So yes, there’s always a theoretical risk with surveillance tech. But the dystopian capabilities you’re describing? They’re not just exaggerated — they’re science fiction at this point.
youtube 2025-08-30T08:3…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          industry_self
Emotion         resignation

Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugz16TiKuftR8OStLTJ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwAFbsVJH7Hv9Hfsh54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxnUAZ9XKwjMTZOoIV4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwxkcqNGhev6R7nzfd4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxawWOdCUWg7UE_sVl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxdwXtMXng-nu_19DZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyeOneUpkK3Y3RF91Z4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz29W0pS8h5lS4HYLV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwgW6xKBno-l1qfa0h4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxyXIJhf_yfpR1ZAYR4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"}
]
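As a minimal sketch of how a raw response like the one above can be turned back into per-comment codes, the snippet below parses the JSON array, indexes it by comment id, and tallies one dimension. The field names (id, responsibility, reasoning, policy, emotion) come from the JSON shown here; the variable names and the two-entry sample payload are illustrative, not part of the tool.

```python
import json
from collections import Counter

# Illustrative sample in the same shape as the raw LLM response above
# (two of the ten entries, copied verbatim from the dump).
raw = """[
  {"id": "ytc_Ugz16TiKuftR8OStLTJ4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxyXIJhf_yfpR1ZAYR4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"}
]"""

codes = json.loads(raw)

# Index by comment id so a single comment's coding can be looked up directly.
by_id = {c["id"]: c for c in codes}

# Tally one dimension across the batch, e.g. the emotion distribution.
emotions = Counter(c["emotion"] for c in codes)

print(by_id["ytc_UgxyXIJhf_yfpR1ZAYR4AaABAg"]["policy"])  # industry_self
print(dict(emotions))
```

Looking up `ytc_UgxyXIJhf_yfpR1ZAYR4AaABAg` recovers exactly the row shown in the Coding Result table above (policy industry_self, emotion resignation), which is one quick way to check that the table and the raw response agree.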