Tomorrow
WhatsJustice
When Big Tech takes on crime
Heads of state, particularly Theresa May, are pushing for tech companies to up their efforts in combatting the bad actors that make use of their platforms or services. Callum Tyndall explores a future where this fight is the tech companies’ alone
It has come into vogue in recent years for world leaders and their law enforcement representatives to use tech companies as their whipping boy in the fight against terrorism and other nefarious folks on the internet. UK Prime Minister Theresa May, in particular, has a tendency to suggest that the fight against criminals is almost entirely hampered by companies such as Facebook.
Following the London Bridge attack in June of last year, she told the press: “We cannot allow this ideology the safe space it needs to breed – yet that is precisely what the internet and the big companies that provide internet-based services provide.”
Now, to be fair, May isn’t completely on the wrong foot here: Twitter and Facebook alone are pretty notorious for their apparent inability to deal with hate speech, fake news and the general awfulness you find online. However, there has been a particular focus on message encryption, and WhatsApp specifically, that has potentially further-reaching consequences.
Should tech companies be doing more to pursue and remove harmful elements from their platforms? Absolutely. Is some of the rhetoric from those like May also of concern when it comes to privacy rights? Also yes.
However, let us imagine for a moment a world where the tech companies took up this fight voluntarily, and to everyone’s satisfaction. Let’s, in fact, go further, and imagine that those companies suddenly viewed the management of bad actors on their platforms as their number one concern.
What if they acted to make it their sole responsibility? After all, who is better placed to deal with the problem from top to bottom?
Algorithmic Arrest: We know what you did last summer and we don’t like it
“You don’t need a special theory of motive, or criminals, to understand where and when crime occurs.”
The thing that tech companies have over everyone else when it comes to tackling and targeting those who use their platforms with ill intent is that they know everything. This is, of course, a double-edged sword, at least for those of us who make use of said platforms.
On the one hand, they have vast swathes of information allowing them to highlight potential trouble, identify threats and monitor harmful accounts. On the other hand, they also have all the same information and access to the accounts of people who’ve never done anything worse than leaving nothing but Bounties in the Celebrations box.
The concern is that we have to trust them with this information; we are painfully reliant on them not using it for anything bad. If, however, they become the sole arbiters of online justice, we have to trust that their methods for designating bad actors will only ever cover the truly criminal.
This concern is what has underpinned all opposition to large-scale anti-privacy efforts: the notion that there is a hard limit to how much we trust a government or a company. For example, the Snoopers’ Charter was brought into UK law under the assertion that it would assist with preventing criminal and terrorist activity. But the oversight such a piece of legislation provides relies on its definitions remaining fixed; if they were to change, it could pose a significant threat to the citizenry. What if it were decided that political protest is a risk? What if visiting sites that are critical of the government were considered a potential threat?
Already, algorithms are being used to aid the course of justice; digital tools tell judges whether a defendant is likely to commit another crime or skip court, provide projections for the police of where and when crime is likely to occur, and rank citizens by their likelihood of involvement in a future violent crime. Aside from the swathes of criticism and concern about the ease with which bias can slip into such an algorithmic system, it is important to remember that, for all our tendency to worship technology and the “genius” of algorithms, they are still created by humans. All that is required to change the outcome is to introduce a convenient new data point to consider.
According to paleoanthropologist Jeffrey Brantingham of the University of California, Los Angeles, who co-developed a model called PredPol that predicts the likely time and location of crimes: “You don’t need a special theory of motive, or criminals, to understand where and when crime occurs, which makes it very amenable to mathematical modelling.”
Just because crime is amenable to modelling doesn’t mean you can trust the model. But in a world where Facebook is entrusted with the handling of justice, you will be told upon your arrest, following a version of the Miranda Rights that will presumably resemble a Terms & Conditions document, that the algorithm picked you up. Maybe you hadn’t done anything, but it highlighted that you were likely to. Maybe your misdemeanour has been re-categorised recently as a sign of major trouble. Maybe you just got unlucky and new specifications highlight one of your posts as criminal.
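The fragility described above can be sketched in a few lines. The following is a purely hypothetical risk-scoring model – every feature name, weight and threshold is invented for illustration – showing how adding one “convenient new data point” (here, protest attendance) flips the same citizen from unflagged to flagged without the underlying maths changing at all.

```python
# Hypothetical risk score: all features, weights and the threshold
# are invented for illustration, not drawn from any real system.
def risk_score(profile, weights):
    """Sum each feature value multiplied by its weight."""
    return sum(weights.get(feature, 0) * value for feature, value in profile.items())

THRESHOLD = 1.0  # arbitrary cut-off above which a citizen is "flagged"

citizen = {"prior_offences": 0, "flagged_posts": 1}

# Version 1 of the model considers only prior offences and flagged posts.
weights_v1 = {"prior_offences": 0.8, "flagged_posts": 0.3}
print(risk_score(citizen, weights_v1) > THRESHOLD)  # False: not flagged

# A new data point is quietly added: attendance at political protests.
citizen["protest_attendance"] = 2
weights_v2 = {**weights_v1, "protest_attendance": 0.5}
print(risk_score(citizen, weights_v2) > THRESHOLD)  # True: now flagged
```

Nothing about the citizen’s behaviour changed between the two runs; only what the model chose to count did.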
Private security for public protection (or, the life and times of a digital detective)
“Enter the fully privatised, all digital police force of the future. This is justice for all, for a very reasonable price.”
Police forces are in the midst of a battle to evolve and adapt to the digital age. With the constant one-upmanship of cyber security, struggles with funding, and a general lag between legislative advancement and actual technological advancement, it may well be a losing battle. Enter the fully privatised, all digital police force of the future. This is justice for all, for a very reasonable price.
It is possible that such a police force would be subsidised by government – they are, after all, doing its work for it (think of the private security contractors brought into Iraq and Afghanistan) – but it is also at least somewhat likely that they would seek other revenue streams to ensure the profitability of their service.
It was reported in March of last year that the My Local Bobby (MLB) company was planning to field private police officers in three of London’s richest areas, with the hope that after three months it would begin to take funding from local companies and businesses.
MLB would have essentially served as glorified security guards, but imagine such a scheme on a nationwide scale, with full government backing. Perhaps you would have a subscription service with varying rates for varying coverage, perhaps pay a charge adjusted to your area’s average income, or perhaps you would simply see the policemen of the future adorned with advertising to cover their costs.
Beyond funding, however, perhaps the principal change will be the shift from human to robot. Why place people in danger when you can fulfil the same function with plastic? Robots are already in place with multiple police forces, primarily in the role of bomb detection and disposal, but we have also seen Indian police purchase drones to assist with riot control, Israel’s use of a pistol-wielding rover, and South Korea’s semi-autonomous border patrol death machines.
Police work can be dangerous and the ability to take humans away from that risk is a good thing. Of course, it’s also convenient that a robotic officer can be packed with a ton more tech and can run off another wonderful algorithm, with no concern that it may ask for compassionate leave in the wake of a shooting.
It is likely, maybe even certain, that the coming years will see police forces across the world steadily become more roboticised. If tech companies were to step into the breach, that shift may accelerate further as the companies seek to create a top-to-bottom process that prioritises efficiency and profit margins.
You could have a tweet flagged as a risk indicator and be picked up by Robocop all within the hour.
Of course, maybe this is too bleak a vision of things to come. Maybe we would see a dramatic overhaul of the justice system, one that delivers maximum efficiency, strips away unnecessary bureaucracy, and provides the best possible service to the customers who are, after all, relied upon to keep the companies’ stock prices high. It would be nice to think so.
Police illustrations courtesy of Surian Soosay