BSIA’s guide to ethical AFR use explained – In conversation with the BSIA’s Ben Linklater

After the British Security Industry Association’s (BSIA) launch of its ethical and legal guide for the use of Automated Facial Recognition (AFR) earlier this year, IFSEC Global catches up with the Chairman of the BSIA’s Video Surveillance Systems Section, Ben Linklater, to find out more about the purpose of the guide.

Although AFR technology has been present in the industry for quite some time, its uses remain extremely diverse. The BSIA’s new guide aims to cover the ethical and legal framework surrounding AFR – a complex issue, but one that affects installers, users and operators of facial recognition surveillance systems.

The new publication is intended for use by system designers, installers, and end-users, and includes an assessment of the need for AFR in given situations.

IFSEC Global (IG): Ben, automated facial recognition technology has been around for some time now, but its use remains divisive. How does the BSIA’s new guide aim to support those in the private security industry?

Ben Linklater (BL): The guide grew out of a discussion within my section, with AI as the over-arching topic. With all the worldwide attention AI has been receiving, and the leaps forward AFR is taking, the technology is becoming more prevalent in the industry and is something our members are keen to talk about.

Technical guidance has already been drafted; however, there is nothing covering the ethical and legal framework within which this technology can be applied.

As an end-user looking to use this technology as a solution, you’re not necessarily sure where to start and end. When we started this, a blanket ban on AFR was being discussed in the UK, but the industry stepped up and provided a legal and ethical framework, and now a ban is completely off the table.

The point of the guide was to remove technical jargon and create guidance that the end-user, as well as the installer, can understand.

IG: And how has the guide been received by the industry so far?

BL: We have received some very positive feedback from across the industry. We’ve had journalists reaching out and asking lots of questions. The guidance has even reached South America, where it’s being used to create a legal framework which could later become legislation.

A lot of people are saying it is the first step towards creating a national standard around the ethical and legal implications of AFR. The guide walks you through the process, so that end-users can have more confidence in the software they use and can start conversations with their solution provider.

IG: What is your opinion on AFR? Do you believe the benefits outweigh the concerns, such as data privacy?

BL: AFR is such a broad topic. If you took AFR off your phone, you would have to go back to your thumbprint – and which is more secure, your face or your thumbprint? Looking at the guidance, it’s important to make sure that what you’re using it for is fit for purpose.

From the perspective of the public, if a building is facing a bomb threat, a system could monitor exactly who is coming in and out of that building. To me, it is no different to passport control: we’re all boarding a plane and want to feel safe on that plane, so what’s the difference?

The BSIA is always lobbying the government and we have worked really closely with the Biometrics Commissioner and the Surveillance Commissioner throughout the creation of this guidance. We exist to help push through the needs of the industry.

IG: The guide details that in both use cases (whether for verification or identification purposes) a human is always ‘in the loop’ – how important is the human element to all of this?

BL: Take the example of an office front door: you’ve given the system data which determines an outcome, e.g. opening the door, and in this case there is no need for human action. When it comes to the identification side, where there might be a negative impact on a person, it’s very important that human intelligence is added to the equation. Technology and humans can then work together to determine either a positive or negative outcome.
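To illustrate the distinction Linklater draws here, the sketch below separates a verification (1:1) check, which can act automatically, from identification (1:N) matching, where candidate hits are only queued for a human operator to review. It is a minimal illustration, not taken from the BSIA guide; the function names, thresholds and data structures are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only; real systems set these
# per deployment and per the applicable legal/ethical framework.
VERIFY_THRESHOLD = 0.90    # similarity needed for an automatic door release
IDENTIFY_THRESHOLD = 0.80  # similarity at which a watchlist hit is flagged


@dataclass
class MatchResult:
    subject_id: str
    score: float


def verify_access(score: float) -> bool:
    """Verification (1:1): the system alone decides whether to open the door."""
    return score >= VERIFY_THRESHOLD


def flag_for_review(candidates: list[MatchResult]) -> list[MatchResult]:
    """Identification (1:N): possible matches are queued for a human operator,
    never acted on automatically, because the outcome may affect a person."""
    return [c for c in candidates if c.score >= IDENTIFY_THRESHOLD]


if __name__ == "__main__":
    # Verification example: the door opens only on a strong match.
    print("Open door:", verify_access(0.93))

    # Identification example: one candidate is strong enough to flag;
    # a human operator makes the final decision on the flagged result.
    hits = flag_for_review([MatchResult("A-102", 0.84), MatchResult("B-317", 0.61)])
    print("For human review:", [h.subject_id for h in hits])
```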

I can understand why people may be fearful, but if we start looking at AI and AFR as a means of keeping the public safe – for example, the ability to detect someone acting suspiciously in a crowd, who could later turn out to be a potential gunman – it is an extremely important tool. The more we use this technology, the more important a legal framework becomes: with great power comes great responsibility.

If the police had access to a watchlist of people stored on AFR cameras across London, could we avoid incidents? Yes.

The BSIA trusts that, used as intended, AI could soon become a crucial tool in ensuring the safety of the public, both in the UK and globally.

The BSIA also has a Special Interest Group (SIG) devoted to AFR and AI, which developed the industry-first Automated Facial Recognition – a guide to legal and ethical use, which has been recognised as a leading piece of work in this innovative, yet controversial, area of technology. The BSIA will be chairing the development of a new standard on the ethical use of AFR at the British Standards Institution (BSI), based on the guidance.

Source: IFSEC Global