The scrapping of facial recognition technology at King's Cross is a step in the right direction.

Katie Gibbs
Sep 5, 2019


The announcement that facial recognition technology has been scrapped at King's Cross after a public outcry is a step in the right direction for AI. This may seem like a step back, since it puts up another barrier to AI adoption, but it's for the right reasons.

It clearly demonstrates that the public want to know when and how AI technology is being used, and what the value exchange is. The King's Cross site had been using facial recognition on its CCTV footage of pedestrians for almost a year but had not been transparent about this until a few weeks ago, and even then it provided no detail about the software used, beyond saying in its statement that it used a "number of detection and tracking methods, including facial recognition" across the development. This announcement led to a social media backlash and prompted Sadiq Khan to write a letter to the owner of the development to express his concerns about how the technology was being used. The mayor of London has since confirmed that the Metropolitan Police were involved in the deployment of facial recognition at the site, while the High Court in Cardiff has ruled that police use of facial recognition is lawful, despite such mass surveillance systems interfering with the privacy rights of the people scanned.

There is a lot of fear and uncertainty surrounding AI, and this development emphasises that transparency is the only way to overcome it. You need to be transparent about using AI technologies, whether you're using them to decide who gets a mortgage or to personalise product recommendations, and many companies have avoided this to date. The fact that public reaction has resulted in a change to how this technology is used highlights why we need to develop user-centric AI solutions. It cannot be an afterthought. The default cannot be AI for the sake of AI: you need to work with users (or, in this case, subjects) to understand what value they get in exchange, be it better identification of criminals or personalised adverts. We should heed their feedback to win their trust.

It’s encouraging that in this situation people were alarmed by the use of the technology, both in terms of its purpose and the privacy concerns it raises. It’s similar to the noise surrounding voice assistants over the past few months, where Siri, Alexa and Google Home have been recording private conversations, even when the wake word hasn’t been triggered. We’re at a turning point where consumers’ privacy concerns are driving real change in the industry.

It’s one of the few examples since the Cambridge Analytica scandal where public outrage has altered the adoption of AI. It sends a clear message to organisations: it’s no longer acceptable to use AI without our knowledge. If you want to use AI to drive value for your organisation, then you need to work with end users to design a solution that works for them as well, otherwise you’ll spend a lot of time and money deploying a solution only for it to be scrapped once it’s live, due to concerns raised by those end users.

For years, companies have forgotten everything they know about User Experience when it comes to AI, so let’s hope that this is a turning point where organisations put the User Experience at the heart of the AI they design and deploy. This is a warning to companies that they can no longer develop AI solutions in isolation. They need to open up the conversation about what their AI vision is and how it drives value for end users and the general public as well as for themselves, and they need to be transparent throughout their AI journey to build and maintain trust. Consumers are waking up to the fact that they can effect change, and companies need to recognise this and alter their ways of working.
