Facial Recognition and Ethics in AI
Facial recognition is a controversial topic. It’s a problematic technology that many didn’t realise was widely used until it started hitting the headlines in the past couple of months.
One such headline was that San Francisco had become the first city to ban facial recognition, meaning that it could not be used by law enforcement or on public transport. The decision was driven by the perceived infringement of people's privacy, as well as the unreliability of the technology: there are documented cases where it has performed poorly for people with darker skin and for women.
When I first used facial recognition several years ago, it was in its infancy. Companies were attempting to label people’s gender automatically to better target marketing to them — and it was more often wrong than right. Because of this, I’ve always been rather sceptical about how effective this technology would be without a very large amount of diverse training data.
Axon, a US company that manufactures body cameras, announced last month that it was banning facial recognition from all of its products. The ban actually came from the company's ethics board, which determined that the technology shouldn't ethically be used until it performs equally well across genders and ethnicities, which it currently cannot do. Axon is hoping to inspire other companies to take a stand against this emerging technology for the sake of their consumers' privacy.
Compare this ethical stand against China’s shocking use of facial recognition on state surveillance cameras to track and control Uighurs — a largely Muslim minority group — claiming that it’s for law enforcement purposes. Considering that most of the conversations around AI in Europe and the US concentrate on how to remove bias from AI to prevent discrimination, this is a shocking example of how AI can be used to automate racism.
The Metropolitan Police has come under fire for using facial recognition on the streets of London to identify people on the most wanted list. In several instances, people who covered their faces to protect their privacy were stopped by the police, had their photo taken and were fined £90 for disorderly behaviour. Protesters in Hong Kong have been recorded using lasers to prevent police from using facial recognition to identify them. So, there are clearly people who, for a range of reasons, do not feel comfortable with this technology being used.
This raises the question: how can we regulate this technology so that we, as the public, can feel comfortable with its use?
Amazon has called for federal regulations on the use of facial recognition, and Microsoft has highlighted that it could be abused. So, it becomes clear that technology providers don't believe that the regulations should be coming from them. I agree. We can't trust such companies to look after the best interests of consumers and their privacy. There needs to be a clear ethical framework around the application of AI. Until a universal one is adopted, the likelihood of which is questionable, we at BJSS are developing our own ethical framework for AI. We believe that it's the responsibility of companies like ours, which design and deliver AI solutions for organisations, to take a step back and think about the end user. How will they interact with this technology, and how will it impact them? How will their personal data be processed, and is there a value exchange that users will achieve by utilising AI technologies?
It's only by combining user-centred design and technical innovation that we can design and deliver AI solutions that work for everyone, including consumers who have all too often been overlooked. Only then can we build trust with users, so that they understand why and how the technology is being used and can decide whether to partake in it based on the value exchange for them, not just the benefits that the technology provider will attain.
We are currently developing our AI ethics framework to support conversations around responsible AI. If you would like to find out more, please get in touch: firstname.lastname@example.org
Originally published at https://www.bjss.com on August 9, 2019.