Mitigating bias in identity verification


As digital identity verification (IDV) platforms are increasingly adopted across the African continent, the potential for bias in these platforms needs to be addressed.

“Biometric identity continues to have a powerful impact on our continent, from banking to public services and electoral systems,” says Gur Geva, CEO and founder of iiDENTIFii. “For this reason, it is critical that we build technology that is unbiased, accessible and reflective of Africa’s diversity.”

AI algorithms have shown bias in the past

Historically, international biometric authentication and face biometrics have carried bias in who they successfully identify. In 2018, an MIT study (https://apo-opa.info/3M2aexK) found that three commercial facial analysis programmes had a margin of error of between 20% and 34% when identifying dark-skinned women, compared to 0.8% or lower for light-skinned men. The same study pointed to the root cause: the facial data used to train at least one of the systems was more than 77% white and more than 83% male. Another study, by the National Institute of Standards and Technology (NIST), found that facial recognition algorithms falsely identified African American and Asian faces 10 to 100 times more often than Caucasian faces.

How technology bias works

Artificial Intelligence (AI) technology is not innately biased against a particular race or gender. It is, however, a reflection of its programming and of the data it has been trained on. If the data sets a face biometrics system was trained on did not contain a diverse range of faces from a wide variety of demographics, the system will be less able to identify the facial patterns of the under-represented groups. Simply put, face biometrics can’t pick up a pattern it hasn’t seen before.

Geva adds, “Every person has a right to identity and to be positively and easily identified. IDV systems need to ensure that the datasets used to train algorithms are balanced according to age, gender, and skin tone. With this in mind, our algorithm has been trained on over 50 million African faces to make it relevant to our continent.”
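Neither the studies cited above nor iiDENTIFii publish their evaluation code, but the balance Geva describes can be checked with a simple audit: compare error rates across demographic groups on a labelled evaluation set. The sketch below is a minimal, hypothetical example of such an audit; the group labels, trial records and thresholds are assumptions for illustration, not a description of any vendor’s system.

```python
# Minimal sketch of a per-demographic bias audit for a face verification model.
# Assumes a labelled evaluation set where each trial records the demographic
# group, whether the pair was genuine, and whether the model declared a match.

from collections import defaultdict

# Hypothetical evaluation records: (group, genuine_pair, model_says_match)
trials = [
    ("lighter-skinned male", True, True),
    ("lighter-skinned male", False, False),
    ("darker-skinned female", True, False),   # a false non-match
    ("darker-skinned female", True, True),
    # ... in practice, thousands of trials per group
]

def error_rates(trials):
    """Compute false non-match and false match rates per demographic group."""
    stats = defaultdict(lambda: {"fnm": 0, "genuine": 0, "fm": 0, "impostor": 0})
    for group, genuine, predicted_match in trials:
        s = stats[group]
        if genuine:
            s["genuine"] += 1
            if not predicted_match:
                s["fnm"] += 1          # genuine pair wrongly rejected
        else:
            s["impostor"] += 1
            if predicted_match:
                s["fm"] += 1           # impostor pair wrongly accepted
    return {
        group: {
            "FNMR": s["fnm"] / s["genuine"] if s["genuine"] else None,
            "FMR": s["fm"] / s["impostor"] if s["impostor"] else None,
        }
        for group, s in stats.items()
    }

if __name__ == "__main__":
    for group, rates in error_rates(trials).items():
        print(group, rates)
```

A large gap in these rates between groups, such as the 20-34% versus 0.8% disparity reported in the MIT study, is the signal that the training data or decision thresholds need rebalancing.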

Inclusion extends to accessibility

It’s not enough to cater for a wider variety of faces. Biometrics also needs to consider broader issues of accessibility in the way identification technology is designed. For example, earlier biometric models required users to perform gestures to prove that the person behind the screen was live, which some people, particularly the elderly, found difficult to execute.

A recent iProov (www.iProov.com) whitepaper on Unlocking Financial Inclusion (https://apo-opa.info/3ZVZAOK) states, “Video call verification, whereby the user verifies their identity via a one-to-one call with a trained operator, is liable to exclude people with language barriers or cognitive impairments. These individuals may be unwilling to engage in a two-way conversation with a stranger.”

The whitepaper reiterates that biometric face verification is a marked improvement on traditional authentication technologies such as passwords, one-time PINs (OTPs) and CAPTCHAs.

Advancements in face authentication technology now make the verification process more seamless and accessible, as people can use their face alone to verify their identity online.

How device limitations impact bias

There is also the matter of device bias. Someone with a lower-quality camera may have a more difficult experience verifying their identity than a person with a more sophisticated device. Technology must perform well for both groups, or it risks being biased. “As banks, governments and other organisations consider identity verification solutions, it is important that they check that vendors can demonstrate robust bias mitigation and compliance with WCAG (https://apo-opa.info/3QknKzs) or other accessibility standards,” says Geva.
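One common way to test for this kind of device bias, offered here as an illustration rather than as any vendor’s method, is to re-run the same verification trials on images degraded to approximate a low-end camera. The sketch below uses Pillow to simulate a lower-resolution, heavily compressed capture; verify_image() is a hypothetical stand-in for whatever verification call a system exposes.

```python
# Sketch: simulate a low-end camera by downscaling and re-compressing an image,
# so the same verification pipeline can be compared across device quality tiers.
# Requires Pillow (pip install Pillow). verify_image() is a hypothetical stand-in.

from io import BytesIO
from PIL import Image

def simulate_low_end_camera(path: str, max_side: int = 480, jpeg_quality: int = 40) -> Image.Image:
    """Return a degraded copy of an image: reduced resolution, heavy JPEG compression."""
    img = Image.open(path).convert("RGB")
    img.thumbnail((max_side, max_side))                  # downscale as a cheap sensor would
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=jpeg_quality)   # introduce compression artefacts
    buf.seek(0)
    return Image.open(buf)

def compare_device_tiers(image_paths, verify_image):
    """Compare verification pass rates on original versus degraded captures."""
    passes = {"original": 0, "degraded": 0}
    for path in image_paths:
        if verify_image(Image.open(path).convert("RGB")):
            passes["original"] += 1
        if verify_image(simulate_low_end_camera(path)):
            passes["degraded"] += 1
    n = len(image_paths)
    return {tier: count / n for tier, count in passes.items()}
```

A significant drop in pass rate on the degraded tier would indicate that the system is effectively biased against users with older or cheaper devices.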

Technology has the power to unlock inclusive growth in Africa

Used correctly, identity verification will enable more Africans to access essential services. For governments, digital IDs reduce administrative burden and support economic growth. For financial institutions, they streamline Know Your Customer (KYC) processes and extend access to the unbanked.

Historically, proof of identity was only available to those who could fulfil a rigid set of criteria. One of the main barriers to opening a bank account, for example, has been the inability to prove one’s identity without a formal identity document or proof of address. According to the World Bank, 57% of Africans still have no bank account, including mobile money accounts. A recent study by BPC and Fincog found that this translates to about 360 million adults in the region, or approximately 17% of the total global unbanked population, without access to formal financial services. According to McKinsey, extending full digital identity coverage could unlock economic value equivalent to 3-13% of GDP by 2030.

Geva says, “Remote face authentication is a crucial step in bridging the digital divide in Africa. If we can be mindful of bias and champion solutions that are accessible and inclusive regardless of technology, ability or location, we will be able to ensure everyone realises their right to identity.”

Distributed by APO Group on behalf of iiDENTIFii.

About iiDENTIFii:
iiDENTIFii is an award-winning face authentication and identity verification platform that distinguishes itself through its use of 3D and 4D Liveness® detection. Purpose-built for enterprises across Africa and the Middle East, iiDENTIFii enables frictionless, scalable customer onboarding in seconds from anywhere and on any device. Founded in 2018, iiDENTIFii has become a proven key partner to multiple tier 1 African banks. The technology plugs seamlessly into existing infrastructures, including mobile and web-based platforms. www.iiDENTIFii.com