ABSTRACT
Facial recognition technology (FRT) is increasingly being applied in both public and private sectors, ranging from law enforcement to identity verification systems. However, its integration into governance frameworks has revealed systematic biases, particularly against transgender people. This research paper examines why algorithmic bias in FRT leads to transgender people being misidentified, excluded, and rendered legally invisible. Drawing on technical insights, legal developments, and international human rights norms, the paper explores how datasets trained primarily on cisgender faces fail to reliably recognize transgender people, particularly those in or after transition.
This paper critically investigates how FRT disproportionately misidentifies transgender and gender non-conforming people, particularly those who are undergoing or have undergone gender transition. These errors stem from binary gendered assumptions embedded in training datasets and computational models that do not represent varied gender expressions and identities.
At the same time, many legal frameworks continue to impose stringent and often onerous criteria for recognizing gender identity, keeping transgender people legally invisible in systems that require state-verified identities.
By linking the technological failures of FRT to the socio-legal marginalization of transgender persons, this study demonstrates how the use of facial recognition technologies without thoughtful planning and legal reform can entrench systemic bias.
It also draws on international human rights instruments such as the Yogyakarta Principles and the ICCPR to frame these issues as violations of the fundamental rights to privacy, dignity, equality, and legal recognition.
The study also investigates the ethical and legal duties of both governments and private entities that deploy these technologies. It argues that algorithmic unfairness is a manifestation of deep-rooted socio-legal systems that undermine gender diversity.
The lack of inclusive policies, collaborative design processes, and regulatory oversight perpetuates discrimination through digital systems.
KEYWORDS
- FACIAL RECOGNITION TECHNOLOGY (FRT)
- TRANSGENDER RIGHTS
- HUMAN RIGHTS
- LEGAL INVISIBILITY
- BIOMETRIC IDENTIFICATION
INTRODUCTION
In the digital world, facial recognition technology (FRT) has become a crucial part of the monitoring, authorization, and identity verification systems used by both governments and private businesses. Although its convenience and security benefits are widely acknowledged, FRT has faced substantial criticism for systemic algorithmic biases, notably against marginalized and disadvantaged groups. Transgender people are among the most affected, as they are often misidentified, excluded, or rendered invisible by digital systems built on cisnormative biometric datasets.
Numerous studies have found that FRT exhibits large accuracy differences across race, gender, and age, with transgender and gender non-conforming people among the most negatively impacted.
These algorithms frequently operate on binary gender concepts and rely on datasets that primarily represent cisgender, male, and lighter-skinned individuals.
As a result, transgender people, particularly those at varying stages of medical or social transition, are commonly misidentified, misgendered, or excluded entirely from systems that rely on facial biometric authentication.
The consequences are far-reaching, including unlawful detention, public humiliation, inability to access critical services, and further marginalization of an already vulnerable community.
This dual burden of technological invisibility and legal erasure creates a dangerous feedback loop.
Transgender people are often misidentified by technology because their legal identities do not match their lived experience, and they often remain legally invisible because technological systems perpetuate inaccurate or outdated gender data.
These challenges are not merely technical defects; they are serious violations of core human rights such as privacy, dignity, equality, and legal recognition.
This research investigates FRT not just as a technological instrument, but also as a mechanism of social control capable of institutionalizing transphobic and gender-based discrimination in digital form.
LITERATURE REVIEW
The intersection of facial recognition technology (FRT) and transgender rights is gaining scholarly attention, though much of the literature remains emergent and incomplete. Existing research highlights systemic challenges, including algorithmic bias in machine learning systems, the socio-legal erasure of transgender identities, and the human rights implications of biometric surveillance. This review synthesizes major contributions from various disciplines to lay the groundwork for the current study.
ALGORITHMIC BIAS & FACIAL RECOGNITION TECHNOLOGY
In their seminal 2018 study Gender Shades, Buolamwini and Gebru found that commercial facial recognition systems exhibit large accuracy disparities across race and gender. Their findings revealed that darker-skinned women faced the highest error rates, while lighter-skinned men were identified most accurately. Although this research did not focus primarily on transgender people, it laid the framework for subsequent assessments of gender misclassification and machine discrimination.
Raji et al. (2020) noted the lack of regulatory control over facial recognition systems, pointing to 'algorithmic auditing' as a necessary mechanism for detecting and correcting hidden discriminatory patterns. Their findings highlighted the broader consequences of biased algorithms in reinforcing existing social structures.
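The core operation behind such audits, measuring a classifier's error rate separately for each demographic subgroup rather than in aggregate, can be sketched in a few lines. The following minimal example is illustrative only; the function name and toy data are invented for demonstration and are not drawn from any cited study:

```python
from collections import defaultdict

def audit_error_rates(predictions, labels, groups):
    """Compute per-subgroup error rates for a classifier's output.

    predictions and labels are parallel lists of predicted and true
    classes; groups holds the demographic subgroup of each sample.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for pred, true, group in zip(predictions, labels, groups):
        totals[group] += 1
        if pred != true:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy illustration (invented data): an aggregate accuracy figure hides
# the fact that nearly all errors fall on one subgroup.
preds  = ["F", "M", "M", "M", "F", "M", "F", "M"]
truth  = ["F", "M", "F", "M", "M", "M", "F", "F"]
groups = ["A", "A", "B", "A", "B", "A", "B", "B"]
rates = audit_error_rates(preds, truth, groups)
print(rates)  # {'A': 0.0, 'B': 0.75}
```

Disaggregating error rates in this way is precisely what exposed the disparities reported in Gender Shades; an overall accuracy metric alone would have concealed them.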
TRANSGENDER EXPERIENCES WITH BIOMETRIC SURVEILLANCE
Keyes (2018), in 'The Misgendering Machines', was one of the first to directly challenge automatic gender recognition systems from a transgender studies perspective.
According to Keyes, gender classification systems rely on outdated binary conventions, thereby rendering non-binary or transitioning individuals invisible.
This work reveals the basic epistemic errors embedded in gender-recognition algorithms and highlights the harm caused by technological misconceptions about gender.
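The structural nature of this invisibility is easy to see in code. The sketch below is a deliberate caricature (the function and threshold are invented for illustration, not taken from any real system): a binary classification head can only ever emit one of two labels, so non-binary identities are unrepresentable by design, not merely misclassified:

```python
def binary_gender_classifier(score: float) -> str:
    """A caricature of a binary gender-classification head: whatever
    the input score, the output space is only {'male', 'female'}."""
    return "female" if score >= 0.5 else "male"

# Sweep every possible score from 0.0 to 1.0: all inputs collapse into
# one of two labels. "non-binary" is not a reachable output, so such
# identities are invisible by construction, not by accident.
outputs = {binary_gender_classifier(s / 10) for s in range(11)}
print(outputs)
```

This is the epistemic error Keyes identifies: no amount of additional training data fixes a model whose label space excludes the people it misrepresents.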
In the Indian context, decisions such as NALSA v. Union of India (2014) created a constitutional framework for the recognition of transgender rights by upholding the right to self-identify one's gender.
However, experts such as Arvind Narrain and Jayna Kothari argue that the implementation of these rights is scattered and shallow, particularly in digital governance frameworks like Aadhaar.
Transgender people continue to struggle to update their identity documents and find it difficult to access services that require biometric or facial verification.
METHOD
The method used in this research is doctrinal and analytical. The legal framework is examined using statutes, constitutional provisions, and case law. The article also conducts comparative legal analysis to better understand different approaches to gender identity recognition and FRT regulation. Although no fieldwork or primary statistical data were gathered, qualitative insights from documented experiences and existing studies supplemented the analysis. This allows the study to objectively assess both the legal gaps and the technological limitations that contribute to the marginalization of transgender people. The research also employs a comparative method, analyzing how jurisdictions such as the United States, the European Union, Argentina, and Malta regulate facial recognition systems and legally recognize gender diversity. This comparative lens allows the study to identify gaps in the Indian legal framework and to draw upon international best practices.
By combining doctrinal research, comparative legal study, interdisciplinary literature engagement, and normative evaluation, the study provides a thorough, rights-based, and policy-relevant critique of how FRT contributes to the legal invisibility and marginalization of transgender people in India and elsewhere.
HYPOTHESIS
Facial recognition technology, as it is currently designed and implemented, disproportionately misidentifies transgender people due to algorithmic bias and a lack of inclusive datasets. This technological discrimination is compounded by the legal invisibility of transgender identities, resulting in systemic violations of their fundamental human rights.
This hypothesis focuses on:
- The technological failure
- The legal failure
- The resulting human rights consequences.
Because of its reliance on binary gender classification systems and non-representative datasets, facial recognition technology (FRT) is fundamentally biased against transgender and gender non-conforming people. These biases lead to increased misidentification, exclusion, and misgendering of transgender people.
OBJECTIVES OF THE STUDY
- To conduct a critical analysis of the prevalence and impact of algorithmic bias in facial recognition technology (FRT), with a focus on transgender and gender non-conforming individuals.
- To investigate how legal systems around the world, including India's, contribute to transgender people's legal invisibility by failing to recognize their gender identities.
- To investigate the interaction between biometric monitoring & transgender rights in the context of international human rights legislation.
- To explore the real-world implications of FRT for transgender people, including misidentification, marginalization, and denial of services.
- To evaluate the current legal & ethical frameworks governing FRT & identify gaps in safeguarding transgender people from technology discrimination.
RESEARCH QUESTIONS
- How does face recognition technology (FRT) transmit & reinforce bias against transgender and gender non-conforming people?
- What are the specific technological and design shortcomings in FRT systems that cause misidentification or rejection of transgender people?
- To what extent do existing legal frameworks in India and around the world acknowledge and accommodate non-binary or changing gender identities in digital identity systems?
- How does the lack of legal recognition of gender identity contribute to the discriminatory consequences of FRT for transgender people?
RESEARCH METHODOLOGY
This research adopts a qualitative and doctrinal legal approach, supplemented by insights from technology studies, gender theory, and international human rights discourse. The study primarily relies on doctrinal legal research to examine statutes, constitutional provisions, judicial precedents, and international legal instruments concerning transgender rights and biometric surveillance. The domestic legal foundation of the analysis is formed by key Indian legal texts such as the Constitution of India (specifically Articles 14, 15, 19, and 21), the Transgender Persons (Protection of Rights) Act, 2019, and landmark judgments such as NALSA v. Union of India and Navtej Singh Johar v. Union of India.
A comparative legal method is also used to investigate how various jurisdictions handle the interaction of facial recognition technology (FRT) and transgender rights.
Countries such as the United States, Argentina, Malta, and members of the European Union provide useful contrasts and lessons, specifically with regard to the legal recognition of non-binary identities, data protection legislation, and FRT-specific rules such as the EU's Artificial Intelligence Act and GDPR safeguards. Using this comparative lens, the study identifies worldwide best practices and gaps that can inform Indian policy reform.
The study also includes an interdisciplinary literature review, drawing on scholarly articles, non-governmental organization reports, and current empirical investigations into algorithmic bias, digital exclusion, and transgender people's lived experiences.
FINDINGS
The study reveals numerous important gaps and obstacles at the intersection of facial recognition technology (FRT) and transgender rights in India, as established through doctrinal analysis and legal precedent:
LEGAL RECOGNITION HAS NOT TRANSLATED INTO TECHNOLOGICAL ACCESS
While the NALSA v. Union of India [(2014) 5 SCC 438] decision affirmed transgender people's right to self-identify, technical systems like FRT are still coded according to binary gender norms.
This mismatch between legal recognition and algorithmic design leads to frequent misidentification or exclusion of transgender people from digital verification systems (such as Aadhaar-based authentication or airport security).
MISALIGNMENT WITH THE RIGHT TO PRIVACY
The Supreme Court's decision in Justice K.S. Puttaswamy v. Union of India [(2017) 10 SCC 1] was a turning point, confirming that privacy, autonomy, and informational self-determination are fundamental to the right to life under Article 21. However, the use of FRT without informed consent and oversight, especially in public surveillance, disproportionately disadvantages transgender people who are already under social scrutiny. Their biometric data is collected without guarantees of dignity or protection against abuse.
IS BINARY GENDER CLASSIFICATION IN FRT CONSTITUTIONALLY VALID IN INDIA?
Artificial intelligence-powered facial recognition technologies (FRT) are primarily based on binary gender classification models that detect and categorize individuals as male or female. This binary classification model is highly questionable under the Constitution: by excluding transgender people, it violates Articles 14, 15, 19(1)(a), and 21 of the Indian Constitution:
Article 14 – Right to Equality
Article 15 – Non-Discrimination
Article 19(1)(a) – Freedom of Expression
Article 21 – Right to Life and Personal Liberty.
VIOLATION OF INTERNATIONAL NORMS ON DATA PROTECTION
S. and Marper v. United Kingdom [2008] ECHR 1581 found that the indiscriminate retention of biometric data violates Article 8 of the European Convention on Human Rights (right to privacy). By comparison, India lacks a specific legal framework regulating FRT. This legal void allows transgender people's data to be stored, examined, or even criminalized without judicial scrutiny or transparency.
LACK OF POLICY SAFETY BARRIERS
Despite India's commitment to human rights and support for the Yogyakarta Principles, there is no explicit prohibition on the use of FRT that discriminates on the basis of gender identity. The lack of trans rights standards in tech design, the absence of regulatory agencies for AI ethics, and the inadequate implementation of the Transgender Persons (Protection of Rights) Act, 2019 all contribute to the invisibility of transgender people in digital spaces.
THREAT TO BODILY & INFORMATIONAL AUTONOMY
When used in state-sponsored surveillance or welfare schemes, facial recognition systems may compel transgender people to select gender categories that do not match their identity, limiting bodily autonomy. This contradicts the spirit of NALSA and Puttaswamy, both of which recognize an individual's right to identity and autonomy.
SUGGESTIONS
Based on the study's doctrinal, comparative, and analytical findings, several key recommendations are made to ensure that facial recognition technology (FRT) is developed, researched, and used in a way that respects transgender people's constitutional and human rights:
INCLUSIVE TECHNOLOGY DESIGN STANDARDS
To ensure that gender-diverse individuals are recognized and appropriately identified, FRT developers and implementers must adhere to binding regulatory guidelines. This includes:
Using non-binary-inclusive datasets in the training phase of facial recognition systems.
Consulting with transgender and LGBTQ communities in the design, development, and deployment of biometric tools.
ADOPTION OF COMPREHENSIVE DATA PROTECTION LEGISLATION
India must adopt a strong data protection law that:
Recognizes gender identity and biometric data as sensitive personal information.
Prohibits the acquisition, storage, or use of biometric data, including facial scans, without informed, free, and explicit consent.
Contains positive safeguards against algorithmic profiling and surveillance of vulnerable communities, including transgender people.
MANDATORY HUMAN RIGHTS IMPACT ASSESSMENTS (HRIAS)
All governmental or private deployments of facial recognition technology should be preceded by a Human Rights Impact Assessment focusing on:
Disproportionate impacts on transgender people
Risks of exclusion, misidentification, and data misuse
Compliance with constitutional provisions and international human rights instruments such as the ICCPR, the UDHR, and the Yogyakarta Principles.
STRENGTHENING THE IMPLEMENTATION OF THE TRANSGENDER PERSONS ACT, 2019
The Transgender Persons (Protection of Rights) Act, 2019 must be reinforced by:
Ensuring that government identity systems (such as Aadhaar and voter ID) are fully gender-inclusive during registration and biometric identification.
Mandating gender sensitivity training for officials in charge of technology-based identity verification systems.
Establishing specialized redressal mechanisms for transgender people who have been excluded or mistreated by automated systems such as FRT.
JUDICIAL & LEGISLATIVE OVERSIGHT OF FACIAL RECOGNITION PROGRAMS
Courts & legislatures must :
Monitor and control the use of FRT in law enforcement and public spaces.
Recognize the right to informational autonomy guaranteed by Article 21 in new digital contexts.
Direct authorities to follow privacy-by-design and anti-discrimination-by-design principles.
CREATION OF AN AI ETHICS & EQUALITY COMMISSION
A statutory or quasi-judicial AI Ethics Commission should be formed with representatives from the following:
The transgender community
Legal experts, technologists, and ethicists
Civil society groups
Its mission should be to audit algorithms, check for bias, and ensure inclusivity in both public and private use of emerging technologies.
PUBLIC AWARENESS & DIGITAL LITERACY INITIATIVES
Launch tailored campaigns to:
Educate transgender communities about their digital rights, privacy safeguards, and relevant grievance redressal mechanisms.
Raise public awareness about the dangers of algorithmic bias and the value of inclusive AI.
CONCLUSION
The digital era, although it offers transformative capabilities for governance and identity management, also carries significant risks of marginalization when technology is designed and used without inclusivity, sensitivity, and accountability. This study reveals how facial recognition technology (FRT), when infused with binary gender norms, routinely erases, misidentifies, or excludes transgender people, perpetuating their legal and social invisibility.
Ultimately, technology must be a tool for empowerment, not erasure. To serve the aims of justice and equality, facial recognition technologies must be redesigned from a rights-based, intersectional, and inclusive perspective. Only then can we ensure that no identity is left behind in the world of artificial intelligence (AI).
REFERENCES
- National Legal Services Authority v. Union of India, (2014) 5 S.C.C. 438 (India).
- Justice K.S. Puttaswamy (Retd.) v. Union of India, (2017) 10 S.C.C. 1 (India).
- Navtej Singh Johar v. Union of India, (2018) 10 S.C.C. 1 (India).
- S. and Marper v. United Kingdom, App. Nos. 30562/04 and 30566/04, 48 Eur. Ct. H.R. 50 (2008).
- Joy Buolamwini & Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, 81 Proc. Mach. Learning Res. 1 (2018), https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf.
- Internet Freedom Foundation, Project Panoptic – Mapping the Use of Facial Recognition in India, https://internetfreedom.in/project-panoptic
- U.N. Hum. Rights. Comm., Gen. Comment No. 16, ¶¶ 1–10.
- The Yogyakarta Principles, Principles on the Application of International Human Rights Law in Relation to Sexual Orientation and Gender Identity (Mar. 2007), https://yogyakartaprinciples.org.
- European Union, Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act), COM/2021/206
- Vidushi Marda, Artificial Intelligence Policy in India: A Framework for Engaging the Limits of Data-Driven Decision Making, 13 Indian J.L. & Tech. 1 (2017).
- Os Keyes, The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition, 2 Proc. ACM Hum.-Comput. Interact. (CSCW) art. 88 (2018).
- Privacy International, Facial Recognition Research Erases Trans and Non-Binary People (2021), https://privacyinternational.org/news-analysis/4755/facial-recognition-research-erases-trans-and-non-binary-people.
- Heinrich Böll Stiftung, Trans Lives Under Surveillance, HBS INDIA (Nov. 2021), https://in.boell.org/en/2021/11/30/trans-lives-under-surveillance.
- Facial Recognition Tech Struggles to Identify Transgender People, ENG'G & TECH., https://eandt.theiet.org/content/articles/2019/10/facial-recognition-tech-struggles-to-identify-transgender-people.
- Facial Recognition Technology Struggles to See Beyond Gender Binary, REUTERS.
AUTHOR : PRETTY JAGDISH BHATIA
COLLEGE : DY PATIL UNIVERSITY
