
The new lantern laws

The mainstream bias of facial recognition technology


In the 18th century, lantern laws in New York City required Black, mixed-race, and Indigenous enslaved people to carry candle lanterns whenever they walked around the city after sunset without the company of a white person (Claudia Garcia-Rojas, ‘The Surveillance of Blackness: From the Trans-Atlantic Slave Trade to Contemporary Surveillance Technologies’, Truthout, 3 March 2016). In recent years, this colonialist practice has resurfaced: minority communities are constantly targeted and tracked, now through technology rather than a lantern.

The Entry-Exit System: smarter and more effective?

The deployment of Facial Recognition Technology (FRT) across a range of different applications, both public and private, has become feasible thanks to technological developments and breakthroughs. Individuals routinely identify themselves through FRT on their smartphones. In addition, governments and law enforcement authorities have massively deployed FRT as a form of public surveillance (whether sufficiently ethical or not). This holds true not only for autocratic regimes such as China, which targets and tracks the movements of minorities, but also in the West, i.e., the United States and the European Union, where several law enforcement authorities have introduced the technology into their day-to-day tasks.

The same applies to the European Union’s strategy to materialise its ‘Smart Borders Package’. A set of measures has been put together to facilitate the control and management of the Union’s external borders, namely through the implementation of the Entry/Exit System (EES System) from September 2022 and the European Travel Information and Authorisation System (ETIAS System) from May 2023. However, under the cover of the fight against irregular migration, these systems aim to create one of the largest databases within the Union, storing biometric data on third-country nationals.

According to the EES System’s Inception Impact Assessment, the total number of border crossings in 2025 is forecast at 887 million, of which one third are expected to be made by third-country nationals. Thus, around 76 million third-country nationals will cross the external borders of the Union per year by 2025.

For the specific case of the EES System, FRT will be deployed at the Member States’ 1,800 external border crossing points of the Schengen Area, where facial images will be captured of every third-country national who crosses the border. This does not mean that a static image will be stored as a record within a larger database. Facial Recognition Technology works in four different steps. First, it captures the facial characteristics of each individual through a state-of-the-art live camera. Then, from that image, the technology extracts the individual’s unique facial data, which is the record stored in the database alongside the third-country national’s personal data, i.e., name, date of birth or nationality.
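The capture-extract-store sequence described above can be sketched in code. This is a hypothetical illustration only: `EESRecord`, `extract_template` and the hash-based "extractor" are stand-ins invented here, not the actual EES implementation, which relies on trained face-recognition models.

```python
from dataclasses import dataclass
import hashlib

@dataclass
class EESRecord:
    # Alphanumeric personal data stored alongside the facial data.
    name: str
    date_of_birth: str
    nationality: str
    face_template: tuple  # the extracted facial features, not the raw image

def extract_template(image_bytes: bytes, dims: int = 8) -> tuple:
    """Placeholder for a feature extractor: maps a captured image to a
    fixed-length numeric template. Real systems use a neural network here."""
    digest = hashlib.sha256(image_bytes).digest()
    return tuple(b / 255 for b in digest[:dims])

def enrol(name: str, dob: str, nationality: str, image_bytes: bytes) -> EESRecord:
    # Only the extracted template is retained in the record, not the image.
    return EESRecord(name, dob, nationality, extract_template(image_bytes))

record = enrol("Jane Doe", "1990-01-01", "XX", b"captured-image-bytes")
```

The point the sketch makes is the one in the text: what the database holds is not a photograph but a derived numeric record tied to the traveller's personal data.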

Stemming from an individual’s existing record, the algorithm running the Facial Recognition Technology can compare those unique facial characteristics against other existing databases. However, access to the database will be restricted to the particular objectives provided by law. For the particular case of the EES System, law enforcement authorities and border management and control authorities can access the records stored within it for two particular purposes: i) the facilitation, effectiveness and speed of border management and control; and ii) the investigation, prosecution and conviction of terrorist and other serious criminal offences committed within the Union.

eu-LISA (the European Union Agency for the Operational Management of Large-Scale IT Systems in the Area of Freedom, Security and Justice) will operate the EES computerised central database of biometric and alphanumeric data, whereas Member States will access the EES records through their National Uniform Interface from their own national border infrastructure. In fact, Member State border control authorities will be at the forefront of the EES System’s implementation. In addition, the EES central database is mandated to be interoperable with other similar systems, such as the central Visa Information System.

On the basis of lawful entry and access to the database, the algorithm running the FRT will produce results. However, this technology does not produce definite results: it works on the basis of probabilities. For instance, if a law enforcement officer seeks to match an image obtained from CCTV against the existing Entry/Exit database on third-country nationals, the machine-learning algorithm will either produce a hit or fail to match the two identities. A hit, however, does not mean that those identities are, in reality, one and the same. It implies a given likelihood that the two individuals match, but there remains a possibility that the characteristics belong to different people altogether. These imprecise results will form the basis of decisions on border crossings (including a possible refusal of entry). In sum, such decisions can rest on inaccurate results derived from statistical probabilities. The materialisation of those errors is known as a false positive, i.e., the misidentification of a person because the algorithm declares a match where there is none.
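The probabilistic matching described above can be sketched as a threshold decision over a similarity score. All the numbers here are invented for illustration; real systems use high-dimensional templates and tuned operational thresholds, not these toy values.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two numeric templates, between -1 and 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def is_hit(probe, record, threshold=0.9):
    # A "hit" means "similar enough", not "the same person": two different
    # people can still score above the threshold (a false positive).
    return cosine_similarity(probe, record) >= threshold

stored = (0.9, 0.1, 0.4)          # template enrolled at the border
same_person = (0.88, 0.12, 0.41)  # a later capture of the same traveller
lookalike = (0.85, 0.2, 0.5)      # a different person with similar features

print(is_hit(same_person, stored))  # True — a genuine match
print(is_hit(lookalike, stored))    # True — a false positive
```

With these values both comparisons clear the threshold, which is precisely the problem the text identifies: the system cannot distinguish a true match from a sufficiently similar stranger, so a hit can never be treated as proof of identity.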

The human bias incorporated into Facial Recognition Technology

Aside from these embedded characteristics, FRT is not yet a fully-fledged technology. Although we tend to think that introducing technology will always bring an automated, neutral component to decision-making, the contrary seems to be true in the case of FRT. This technology is built and trained by humans. Thus, human bias is innately (and perhaps unwittingly) integrated into the algorithm that produces FRT’s results. For instance, if the engineer designing the algorithm trains it predominantly on the facial characteristics of males, the algorithm will use their phenotypes as the blueprint for its functioning.

Research conducted by the National Institute of Standards and Technology assessed the performance of different algorithms across the world and showed that even the most developed and advanced Facial Recognition Technology falls short when identifying phenotypes other than those predominantly present in its training databases. In the case of algorithms developed in the United States and Europe, white males are overrepresented within the training databases used to test their matching capabilities against other images. Therefore, when identifying individuals of a different race or gender identity, FRT tends to be far more imprecise and produces higher false-positive rates for women, Black people and LGBTQ+ communities. The same applies to algorithms developed in Asia: Asian males are overrepresented within their training databases, and the resulting algorithms particularly discriminate against minorities, as well as against the aforementioned communities.
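The disparity in error rates can be made concrete with a toy calculation of a per-group false-positive rate, i.e., the share of non-matching comparisons that the system wrongly declares a match. The counts below are invented for illustration and are not NIST's figures.

```python
def false_positive_rate(false_positives: int, true_negatives: int) -> float:
    """FPR = false positives / all comparisons of genuinely different people."""
    return false_positives / (false_positives + true_negatives)

# Hypothetical evaluation counts: 10,000 non-matching probe pairs per group.
overrepresented = false_positive_rate(false_positives=10, true_negatives=9990)
underrepresented = false_positive_rate(false_positives=120, true_negatives=9880)

print(f"{overrepresented:.4f}")   # 0.0010
print(f"{underrepresented:.4f}")  # 0.0120
```

An overall error rate averaged across all travellers can look acceptable while the burden of misidentification falls, as the text argues, disproportionately on the groups underrepresented in the training data.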

If we bear in mind the human bias embedded within FRT algorithms, we can also fathom the real and cognisable impact that will be produced, at scale, to the detriment of third-country nationals’ fundamental rights, namely the right to non-discrimination enshrined in Article 21 of the Charter. Just as EU citizens have the right not to suffer discrimination on the basis of their race, colour or sex, the same should apply to third-country nationals. However, once the EES System enters the scene, the technical flaws of FRT directly influence the outcome of the results produced through its implementation.

On top of that, I consider that the European Commission has fallen short of offering a real solution to counteract the deficiencies of these algorithms. In fact, it has only published specifications to ensure that FRT is correctly implemented. For the particular case of false positives, it has imposed obligations on Member States to ensure that only a limited rate of misidentifications is admissible (0.1 per cent). However, if we bring this percentage to the EES System’s real dimension (295 million people affected), the European Commission effectively sanctions the misidentification of 295,000 third-country nationals, both for law enforcement and border control tasks. Moreover, these specifications provide no further guidance to Member State authorities acting at external border crossing points, even though they may be heavily influenced by the equivocal results produced by FRT. Not one provision instructs border authorities against automatically translating these results into refusal-of-entry decisions, irrespective of the Court of Justice’s position in Schwarz (para. 44).

Therefore, in my opinion, the EES System gives leeway to an unresolved interference between the fundamental right not to suffer discrimination and the right to liberty and security provided by its legal basis. All in all, the Union’s external borders will not be sufficiently secured if, in securing them, the exercise of third-country nationals’ other fundamental rights is not guaranteed, according to the mandatory legal test under Article 52(1) of the Charter. We cannot strive for a smarter border system if it comes at the cost of decimating fundamental rights by imposing a segmented and fragmented understanding of Facial Recognition Technology on the officials managing its results and its drawbacks.
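The 295,000 figure follows directly from the two numbers given in the text; a minimal sketch, assuming the 0.1 per cent admissible rate applies uniformly to the 295 million people affected.

```python
affected = 295_000_000     # people affected, per the EES System's dimension
admissible_rate = 0.001    # the Commission's 0.1 per cent misidentification ceiling

misidentified = affected * admissible_rate
print(int(misidentified))  # 295000
```

A rate that sounds negligible in the abstract translates, at the scale of the Union's external borders, into hundreds of thousands of individual misidentifications.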

Suggested citation

Alba Ribera Martínez, ‘The new lantern laws: the mainstream bias of facial recognition technology’ (The Digital Constitutionalist, 07 July 2022) <https://digi-con.org/the-new-lantern-laws>

Alba Ribera Martínez
PhD Student at University Carlos III of Madrid
