
Freedom through technology – helping our overstretched Police

April 18, 2024 by Alistair Enser

Over the last week I have noticed a number of examples of the clash between individual privacy and technological progress, as news items have highlighted our growing reliance on tech, as well as concerns over how that tech is used and the power it has over us. For every real and credible concern about civil liberties, there is a genuine and hugely beneficial use case on the positive side. Is 2021 the year when we take some brave steps forward?

Example one: Google claims to have completed its acquisition of Fitbit, despite the outstanding US Department of Justice investigation into the deal. Google has said that “Fitbit users’ health and wellness data won’t be used for Google ads”, despite that data offering Google the tempting opportunity to monetise every single step that Fitbit’s 28 million users take.

Example two: my article last week mentioned the head of UK counter-terrorism policing, Assistant Commissioner Neil Basu. I read this week that Tony Porter, the Surveillance Camera Commissioner, has left his post. A former Assistant Chief Constable of Greater Manchester Police, Tony was an advocate for the use of facial recognition technology by police forces, despite some cities in the USA banning its use over alleged inaccuracy. He has left to take up the post of Chief Privacy Officer at Corsight AI, a facial recognition technology provider!

Example three: Amazon’s launch of its AWS Panorama computer vision solution has attracted criticism for inviting companies to track their employees at work. Would I want my employer analysing every last thing I do to assess my performance? No. Would I support it using analytics to detect health and safety breaches, like someone riding on a forklift truck in a warehouse or smoking in a dangerous area at work? Absolutely!

Example four: Social media platforms have frozen President Trump’s accounts after accusations that he incited the violence on Capitol Hill. Famously protected by Section 230 of the Communications Decency Act, these platforms are not publishers in the true sense, so they don’t legally bear full responsibility for the messages they carry. But the threat of regulatory control is forcing them to moderate content and ban users, even if it dents the argument in favour of free speech. And by the way, I for one think they should stand up and be counted.

Example five: A coalition of leading technology companies, health organisations and non-profit groups, including Microsoft, Oracle and Salesforce, announced they were developing technology standards to allow consumers to prove their Covid-19 vaccination status through a health passport app. As I have written before, a passport would allow much of society to return to normal, bringing confidence and safety to an uncertain situation, although some see this as a ‘gateway drug’ to compulsory ID cards.

A step too far – or a step in the right direction?

“We’ve really seen the pandemic used as a tech experiment on people’s rights and freedoms,” says Ella Jakubowska of European Digital Rights, an advocacy group.

I don’t agree with this argument. While huge strides have been taken in the adoption of new technologies over the last year, the technologies themselves aren’t necessarily new, and the feedback from my prior surveys strongly suggests that a large number of people are ready to compromise. Claims like this often ignore the very real benefits that technology brings in times of unprecedented threat and focus instead on the negatives. Those negatives should be recognised, of course, but they should not become a reason to dismiss the solution. Here are two recent examples from the news.

Last week Khairi Saadallah received a life sentence for murdering three people in a park in Reading. Police released CCTV footage of the terrorist purchasing a knife from Morrisons, leaving his apartment and then hiding his rucksack behind some bins, presumably removing the knife, before committing his terrible crimes. I don’t wish to be controversial, and I hold the brave officers who detained Saadallah in high regard, but if the authorities had been better equipped to ‘glue’ all this data together through behavioural analytics, could they have been alerted to his suspicious activity in time? It’s a fine line in terms of privacy and surveillance, for sure, but sadly the evidence was there to be had.

Last week also saw the death of Mohamud Mohammed Hassan following his release from Police custody in Wales. While Police have shared CCTV and body-worn video footage as part of the investigation, I wonder why we don’t make more use of video analytics to help protect both the Police against claims and people in custody from harm. Whether the risk stems from mental health issues, suicidal intent or unmet medical needs, we at Reliance High-Tech have investigated many technologies that may assist police, potentially reduce the risk of harm or death, and reduce the burden and workload on an already overstretched force.

For example, speaking to a manufacturer recently, I was interested to learn that they have developed tools to help desk sergeants in custody suites query NHS databases and identify people who may have specific medical needs such as diabetes, epilepsy, asthma or mental health issues, all in an effort to preserve life and protect those in custody. While the privacy luddites seem to have an irrational fear of sharing data across public services, surely there is nothing wrong with giving police the information that will help them do their job better and keep people safer. Given the significant personal and financial cost of a death in custody, what price do we put on safety?

A tool, like any other

Many of those who fear this tech make the mistake of seeing it as wholly separated from human intervention. But analytics, the AI that underpins it, and the algorithms that ‘fire its neurons’ are not a replacement for human judgement and decision-making, and nor should they be.

Analytics should be used to augment what we are doing: to be a second set of eyes, not to replace them entirely. If you blindly follow facial recognition analysis without conducting further enquiries into the results it throws up, you can of course expect to arrest the wrong person. But as we saw with the Salisbury nerve agent attacks, if Police can mine a larger cohort of data using technology, they can highlight more potential matches, more efficiently and faster; those matches can then be filtered through human investigation and a proper offline process to get results.
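To make that “second set of eyes” idea concrete, here is a minimal, purely hypothetical sketch of the triage pattern I mean: the software only shortlists candidate matches above a confidence threshold for a human investigator to review, and never makes the decision itself. The embeddings, names and threshold below are invented for illustration and are not drawn from any real police system.

```python
# A minimal, hypothetical sketch of analytics as a "second set of eyes":
# the software shortlists likely matches; humans make the actual decision.
# Embeddings, names and the threshold are illustrative only.
import numpy as np


def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def shortlist_for_review(probe, gallery, threshold=0.75, max_results=10):
    """Return the gallery identities most similar to the probe image.

    This is a triage list for a human investigator, not a verdict:
    every candidate still requires proper offline enquiries.
    """
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    plausible = [(name, score) for name, score in scored if score >= threshold]
    plausible.sort(key=lambda item: item[1], reverse=True)
    return plausible[:max_results]


# Example with random vectors standing in for real face embeddings.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1_000)}
probe = gallery["person_42"] + rng.normal(scale=0.1, size=128)  # a noisy new sighting

for name, score in shortlist_for_review(probe, gallery):
    print(f"Candidate for human review: {name} (similarity {score:.2f})")
```

The point of the sketch is the division of labour: the machine narrows a large gallery to a handful of plausible candidates quickly, and humans still carry out the enquiries before anything is acted upon.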

Returning once again to those objections to a digital vaccine passport, we all value our freedoms. But our freedoms rest on respect for everyone else’s freedoms, and on the responsibility that we all have to each other. During a global pandemic, is it really unreasonable that efforts to control it should involve sharing data and carrying a vaccination passport?

Last week I wrote that I was surprised that analytics hadn’t accelerated as much as I expected. Maybe 2021 is the year when that changes.