Today I received an official notice from the Social Security Administration saying that they had overpaid me on some benefits and that, if I don’t pay them back, the Treasury Dept. may withhold the money from any future tax refund due to my SSN, which they conveniently list in the letter. Fortunately, the SSN they listed isn’t mine, but it’s my name on the letter, and my new address – so clearly something has gotten screwed up.
While I was pondering the potentially Kafkaesque phone call I’ll have with Social Security on Monday, I saw some news about the new version of RoboCop coming out in February 2014, in which, among other things, Samuel L. Jackson’s character talks about RoboCop and the use of armed drones as the “future of law enforcement.”
That got me thinking about the trend toward automated tools for law enforcement – from speed and red-light cameras to black boxes in cars and facial recognition. Computers are good for a lot of things, but without a human element to check the results and examine the context of certain behaviors, we may be setting ourselves up for a world we won’t like.
As is clearly the case with the letter I got from the SSA, computers can make errors. Sometimes the errors are the result of human input, sometimes of a plain bug, but the consequences can be significant, and they move us toward a “guilty until proven innocent” world where the computer decides you’re guilty and it’s up to you to prove the computer is wrong.
That was the case for John Gass. As reported by the Boston Globe:
John H. Gass hadn’t had a traffic ticket in years, so the Natick resident was surprised this spring when he received a letter from the Massachusetts Registry of Motor Vehicles informing him to cease driving because his license had been revoked.
After frantic calls and a hearing with Registry officials, Gass learned the problem: An antiterrorism computerized facial recognition system that scans a database of millions of state driver’s license images had picked his as a possible fraud.
It turned out Gass was flagged because he looks like another driver, not because his image was being used to create a fake identity. His driving privileges were returned but, he alleges in a lawsuit, only after 10 days of bureaucratic wrangling to prove he is who he says he is.
But even if everything operates perfectly, do we really want a world where we can be monitored 24×7 by cameras with facial recognition software that automatically churn out fines for misdemeanor-level behavior?
Jaywalking is illegal in many cities, but if there’s no one around and no traffic, we all cross the street outside of a crosswalk, or even against the light, from time to time. A police officer seeing that would evaluate the situation and decide whether strict enforcement was appropriate. Software, on the other hand, sees the violation, uses facial recognition to identify the perpetrator, and mails out the ticket – just like automated speed and red-light cameras do with license plates.
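The difference comes down to whether context enters the decision at all. Here’s a minimal, entirely hypothetical Python sketch of that contrast – the names and fields are invented for illustration, not drawn from any real enforcement system:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Sighting:
    """One hypothetical camera observation of a pedestrian."""
    face_match: str       # identity guessed by facial recognition
    in_crosswalk: bool
    traffic_nearby: bool

def automated_enforcement(s: Sighting) -> Optional[str]:
    """Software path: any violation produces a ticket, context ignored."""
    if not s.in_crosswalk:
        return f"ticket mailed to {s.face_match}"
    return None

def officer_enforcement(s: Sighting) -> Optional[str]:
    """Human path: the same violation is weighed against its context."""
    if not s.in_crosswalk and s.traffic_nearby:
        return f"citation issued to {s.face_match}"
    return None  # empty street, no harm done: discretion applied

s = Sighting(face_match="J. Walker", in_crosswalk=False, traffic_nearby=False)
print(automated_enforcement(s))  # the software tickets regardless of context
print(officer_enforcement(s))    # the officer lets it go
```

Note that the only difference between the two functions is a single context check – and that check is exactly the part that’s hard to automate.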
Right now, drivers only face automated enforcement of speed limits and red lights, but as the technology expands, think about all of the violations you could be cited for while driving that an automated system could monitor:
– Passing on the right
– Failure to turn on headlights when windshield wipers are on
– Failure to use a turn signal
– Failure to come to a complete stop at EVERY stop sign
– and the list goes on…
Pervasive automated enforcement could turn a drive of almost any distance into an expensive venture.
And for bikers the issue could be even worse. Many bikers use discretion when deciding which traffic laws to follow at any given moment – among other reasons because stopping, losing your momentum, and unclipping from clipless pedals at every stop sign and stoplight when there’s no traffic makes biking a whole lot harder.
Going beyond the realm of simple traffic enforcement, you also have to be careful what you admit online. The case of Jacob Cox-Brown, as reported by TechCrunch, seems pretty clear cut:
Police made an example out of a teenager from Oregon who boasted about driving drunk on Facebook. “Drivin drunk… classic 😉 but whoever’s vehicle i hit i am sorry. 😛 ,” wrote the clueless 18-year-old. According to local news channel KGW, two people tipped the officers via Facebook about the post. After inspecting the most-likely-profusely-sweating/hungover teen’s car, the damage on his vehicle matched that of two other vehicles hit earlier that New Year’s morning.
And, with their powers of deduction…bam! Handcuffs. The suspect was charged with two counts of “failing to perform the duties of a driver,” but not drunk driving, because a Facebook post is apparently not sufficient evidence of intoxication, according to KGW’s report from Deputy Chief Brad Johnston.
But there are plenty of other things people confess online, and an algorithm that searched pictures and posts for confessions of illegal activity, identified the culprits via Facebook photo tagging, and automatically mailed them a ticket isn’t that far-fetched.
Technology has the power to assist law enforcement – but it should be an assistant rather than an automated evidence collector and punishment distributor.