Study Calls for Clear Rules on Passenger Info Deletion in Digi Yatra Policy

The policy for Digi Yatra, a digital platform that verifies air travellers using biometric data, needs to clearly outline rules regarding the deletion of passenger information post-travel, according to a study by NITI Aayog. Privacy concerns have been raised about user data on Digi Yatra.

Clear Data Deletion Rules for Digi Yatra

Digi Yatra uses Facial Recognition Technology (FRT) to enable contactless and seamless movement of passengers at airport checkpoints. The policy states that facial biometrics are deleted from local airport databases 24 hours after the passenger's flight departs. However, the study emphasized the need for clear rules on deleting other collected information and any facial biometrics stored in other registries.
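The 24-hour deletion rule described above amounts to a simple time-based retention check. The sketch below illustrates the idea in Python; it is purely illustrative — the record fields and function names are hypothetical and do not come from the Digi Yatra SOP or any published API.

```python
from datetime import datetime, timedelta

# Per the policy, biometrics are purged 24 hours after flight departure.
RETENTION_WINDOW = timedelta(hours=24)

def is_expired(departure_time: datetime, now: datetime) -> bool:
    """Return True once the 24-hour post-departure window has elapsed."""
    return now - departure_time >= RETENTION_WINDOW

def purge_expired(records: list, now: datetime) -> list:
    """Keep only records still within the retention window.

    Each record is a hypothetical dict with a 'departure' datetime.
    """
    return [r for r in records if not is_expired(r["departure"], now)]
```

The study's point is that an equivalent, explicit rule should exist for every other category of collected data, not just the locally stored facial biometrics.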

Facial Recognition Technology and Data Retention

Digi Yatra aims to create an identity management ecosystem for Indian airports, enhancing civil aviation infrastructure, digitising manual processes, improving security standards, and reducing operational costs. The study recommended that the Digi Yatra Standard Operating Procedures (SOP) specify timelines and purposes for retaining different types of data within the Central Ecosystem, beyond which personal data should be deleted.

The policy states that participation in Digi Yatra is voluntary. The study noted, however, that if participation were ever made mandatory, the programme would have to comply with the principles laid down in K.S. Puttaswamy v. Union of India, the case that established the constitutional right to privacy and set out the tests of legality, necessity, and proportionality.

Cybersecurity and Algorithmic Audits

The study suggested frequent cybersecurity audits and vulnerability testing of the Digi Yatra platform to ensure reliability, usability, and information security. It also recommended establishing a mechanism for algorithmic audits by independent auditors before system deployment and at periodic intervals.

Additionally, internal SOPs for handling personal and sensitive data must be defined. The study proposed ongoing monitoring of the system's performance and said that value-added services using facial recognition data should be activated only through opt-in consent, with the ability to revoke that consent at any time.

Legal Reforms and Ethical Oversight

The study highlighted the need for policy and legal reforms to regulate facial recognition technology in India. It proposed imposing liability for harms or damages caused by FRT systems through codes of practice, industry manuals, self-regulation, or formal statutes.

Organisations deploying AI systems should establish ethical committees to assess implications and oversee mitigation measures. For FRT systems specifically, these committees should have autonomy to prescribe guidelines and ensure compliance.

The Ministry of Civil Aviation in India has initiated the Digi Yatra programme using FRT and Facial Verification Technology (FVT) at various process points to improve the travel experience. FRT refers to an AI system that identifies or verifies individuals by matching images or video frames against an underlying algorithmic model.

Despite privacy concerns, the Digi Yatra Foundation stated in April that the Digi Yatra Central Ecosystem (DYCE) does not store any Personally Identifiable Information (PII) in any central repository. The foundation serves as the nodal agency for the app.

Digi Yatra proposes using FRT to authenticate passengers' travel credentials, allowing automated operation of airport checkpoints with minimal human involvement. This technology has potential benefits for streamlining airport operations and enhancing the civil aviation ecosystem.


The objective is to create a comprehensive governance framework that addresses the challenges posed by FRT systems and adapts its safeguards as threats evolve.
