Facial Recognition Technology (FRT)

Syllabus
GS Paper II – Important Aspects of Governance, Transparency and Accountability, E-governance- applications, models, successes, limitations, and potential;
GS Paper III – Awareness in the fields of IT, Space, Computers, Robotics, Nano-technology, Bio-technology and issues relating to Intellectual Property Rights.

Context
NITI Aayog has released a white paper, Responsible AI for All (RAI), on Facial Recognition Technology. The paper examines FRT as the first use case under NITI Aayog’s RAI principles and aims to establish a framework for the responsible and safe development and deployment of FRT in India.


Facial recognition technology (FRT) is a system that analyzes facial features to identify individuals. By scanning a person’s face and comparing it against a database of stored images, FRT confirms or verifies their identity. Notably, the Delhi Police has set a threshold of 80% similarity for positive matches, while anything below that requires additional corroborating evidence. The data used by the Delhi Police includes photographs collected under the Identification of Prisoners Act, 1920, which has since been replaced by the Criminal Procedure (Identification) Act, 2022. FRT finds applications in security, authentication, and various other domains.
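
As an illustration of the comparison step just described, here is a minimal sketch of 1:1 verification: two face “embeddings” (numerical representations of facial features) are scored for similarity, and a match is declared only above a cutoff. Everything in it is assumed for demonstration; the embeddings are made up, cosine similarity is only one possible measure, and the 0.80 cutoff merely mirrors the 80% figure cited above rather than describing the Delhi Police’s actual pipeline.

```python
import numpy as np

MATCH_THRESHOLD = 0.80  # assumed cutoff, mirroring the 80% figure cited above

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray) -> bool:
    """1:1 verification: does the probe face match the enrolled face?"""
    return cosine_similarity(probe, enrolled) >= MATCH_THRESHOLD

# Toy embeddings standing in for the output of a real face-encoding model.
enrolled_face = np.array([0.12, 0.85, 0.40, 0.31])
probe_face = np.array([0.10, 0.83, 0.42, 0.30])

print("Match" if verify(probe_face, enrolled_face)
      else "Below threshold: needs corroborating evidence")
```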

  • Facial Detection:
    • Algorithms detect the presence of human faces in images or videos.
    • Mathematical representations identify distinctive facial features.
    • Facial recognition cross-references these features with a pre-existing database.
  • Working of FRT:
    • FRT maps, analyzes, and confirms identities from photographs or videos.
    • Computer-generated filters transform images into numerical expressions for comparison.
    • Key parameters include the distance between eyes and forehead-to-chin measurements.
  • Automated Facial Recognition System (AFRS):
    • A large database stores photos and videos of people’s faces.
    • AFRS matches and identifies individuals by comparing unidentified images (e.g., CCTV footage) with the database (a simple sketch of this 1:n search appears after this list).
  • Applications:
    • Security Related:
      • Law enforcement (surveillance, identifying persons of interest, crowd monitoring).
    • Non-Security Related:
      • Contactless onboarding at airports (e.g., Digi Yatra).
      • Unique IDs in educational institutions.
      • Authentication for accessing products, services, and public benefits.
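
To make the AFRS-style database search concrete, below is a minimal 1:n identification sketch: a probe face (say, a crop from CCTV footage) is compared against every entry in a small gallery, and only candidates scoring above a threshold are returned. The gallery, the embeddings, the 0.80 threshold, and the top-k limit are all illustrative assumptions, not any agency’s real system.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_matches(probe, gallery, threshold=0.80, top_k=3):
    """1:n identification: rank every gallery identity by similarity to the
    probe and keep only the top candidates that clear the threshold."""
    scored = [(name, cosine(probe, emb)) for name, emb in gallery.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [(name, score) for name, score in scored[:top_k] if score >= threshold]

# Toy gallery standing in for a photo database; the embeddings are made up.
gallery = {
    "person_A": np.array([0.9, 0.1, 0.3]),
    "person_B": np.array([0.2, 0.8, 0.5]),
    "person_C": np.array([0.4, 0.4, 0.7]),
}
probe = np.array([0.85, 0.15, 0.35])  # e.g., a face cropped from CCTV footage

print(best_matches(probe, gallery))
```

Even a short candidate list like this is only a lead for human review; the concerns listed next arise precisely because such leads can be wrong.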

Concerns Related to the Use of FRT in India
  • FRT for 1:n Identification:
    • Law enforcement agencies, including the Delhi Police, commonly use FRT for one-to-many identification.
  • 80% Threshold:
    • The basis for choosing 80% as the similarity threshold is unclear.
    • Results below 80% are categorized as false positives, yet they are not discarded; the Delhi Police subjects them to further investigation supported by additional evidence.
  • Risk of Familial Targeting:
    • People sharing familial facial features (e.g., extended families or communities) could be wrongly targeted.
    • This may disproportionately affect historically overpoliced communities.
  • Concerns about Data Collection:
    • The Criminal Procedure (Identification) Act, 2022, raises fears of overbroad data collection.
    • Compliance with international best practices for data processing is essential.
  • Function Creep:
    • FRT’s expanded use beyond its original purpose exemplifies “function creep.”
    • Delhi Police employed FRT during specific incidents, such as the 2020 northeast Delhi riots and the 2021 Red Fort violence.
  • Design-Based Risks:
    • Automation Bias, discrimination, and lack of accountability.
    • Misidentification or inaccuracy due to underrepresentation in databases.
  • Rights-Based Issues:
    • Concerns about Privacy and lack of consent.
    • Preservation of Informational Autonomy and handling sensitive personal data.
  • Inaccuracy & Misuse:
    • Risks related to Misidentification due to technology inaccuracies.
    • Potential for Mass Surveillance if FRT is misused.
  • Race & Gender Disparities:
    • Accuracy rates varying significantly based on race and gender.
    • False positives (misidentifications) and false negatives (verification failures) can occur (a toy per-group error-rate audit is sketched after this list).
  • Exclusion:
    • False negatives may exclude individuals from accessing essential services.
    • For instance, Aadhaar-based biometric authentication has led to exclusion and even starvation deaths.
  • Privacy Violations: Balancing privacy concerns with FRT’s objectives remains challenging.
  • Reliability & Authenticity: Data collected may be used in court, emphasizing reliability and adherence to standards.
  • Data Protection Gap: Absence of data protection laws leaves FRT systems lacking necessary safeguards.
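
The disparity and exclusion concerns above are commonly quantified by computing false positive and false negative rates separately for each demographic group, which is also what a periodic audit (see “Regular Audits” below) would compare. A minimal sketch on entirely synthetic outcomes, assuming hypothetical groups and results:

```python
from collections import defaultdict

# Each record: (group, ground_truth_is_match, system_said_match). Synthetic data only.
outcomes = [
    ("group_X", True, True), ("group_X", False, False), ("group_X", False, True),
    ("group_Y", True, False), ("group_Y", False, False), ("group_Y", True, True),
]

counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
for group, truth, predicted in outcomes:
    if truth:
        counts[group]["pos"] += 1
        if not predicted:
            counts[group]["fn"] += 1  # false negative: genuine person rejected (exclusion risk)
    else:
        counts[group]["neg"] += 1
        if predicted:
            counts[group]["fp"] += 1  # false positive: wrong person flagged (misidentification risk)

for group, c in sorted(counts.items()):
    fpr = c["fp"] / c["neg"] if c["neg"] else 0.0
    fnr = c["fn"] / c["pos"] if c["pos"] else 0.0
    print(f"{group}: false positive rate = {fpr:.2f}, false negative rate = {fnr:.2f}")
```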

Way Forward
  • Privacy and Security:
    • Establish a data protection regime based on legality, reasonability, and proportionality.
    • Ensure privacy safeguards for FRT systems.
  • Accountability:
    • Address transparency, algorithmic accountability, and biases.
    • Hold responsible parties accountable for FRT deployment.
  • Safety and Reliability:
    • Publish standards for FRT, focusing on explainability, bias mitigation, and error handling.
    • Prioritize safety and reliability in system design.
  • Human Values:
    • Form an ethical committee to assess implications and oversee mitigation measures.
    • Reinforce positive human values in FRT development.
  • Informed Consent: Ensure individuals’ consent before using FRT.
  • Regular Audits: Conduct periodic audits to evaluate FRT’s compliance with principles.

Conclusion
Facial recognition technology (FRT) has matured into a powerful tool for identification and identity verification. While some applications offer convenience, efficiency, or enhanced safety, others, including some already deployed in the United States, raise significant equity, privacy, and civil liberties concerns. These concerns stem from both poor performance (such as unacceptable false positive or false negative rates) and problematic use or misuse of the technology. Addressing these issues requires a multifaceted approach, including improving the technology, refining procedures, and considering regulation. FRT remains a double-edged sword, balancing innovation benefits with ethical and legal implications.

Source: NITI Aayog


Previous Year Question
E-governance is not only about utilization of the power of new technology, but also much about critical importance of the ‘use value’ of information. Explain. [UPSC Civil Services Exam – 2018 Mains]


Practice Question
The design and deployment of facial recognition technology (FRT) by law enforcement agencies in India has been a subject of some deliberation. In this regard, discuss facial recognition technology. What are the concerns related to the utilization of facial recognition technology in India? Illustrate. [250 words]

