NITI Aayog seeks clarity from govt on Digi Yatra data privacy


Amid growing concerns about the use of facial recognition technology (FRT) across sectors, the NITI Aayog has sought clarity from the government on how passenger data may be stored, whether the system complies with existing laws, and how trustworthy the ecosystem governing Digi Yatra is. Digi Yatra is an FRT-based flight boarding system that the government proposes to roll out across the aviation sector.
The Digi Yatra Policy envisages Digi Yatra as a completely voluntary scheme. Where passengers sign up and consent to use Digi Yatra for check-in and boarding, that consent would have the legal character of a voluntary agreement for the temporary collection, temporary storage, and use of their data.
"This agreement must comply with existing laws and rules on data privacy," a discussion paper on "Responsible Artificial Intelligence For All" by the government think tank has suggested.
Released on November 2, the discussion paper focuses on the responsible use of artificial intelligence-based tools and technologies such as FRT.
The paper goes on to add that the "rules are set out presently under the Information Technology Act, 2000, and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 ('SPDI Rules').
"Given that the Digi Yatra Foundation, which operationalised the Digi Yatra Central Ecosystem, is established under the Companies Act, 2013, it would amount to a 'body corporate' for the purposes of the SPDI Rules. Therefore, it would be necessary for Digi Yatra to comply with the SPDI rules."
The Digi Yatra Policy further states that facial biometrics data is deleted from the local airport's database 24 hours after the departure of the passenger's flight.
NITI Aayog, however, wants the rules on the deletion of other information collected from passengers, as well as of any facial biometrics stored in other registries, to be clearly set out in Digi Yatra's policy.
The Digi Yatra Policy mentions that users may also be able to provide consent for value-added services at the airport, for which their data may be shared with other entities such as cab operators and other commercial players.
On this, the government think tank has suggested that specific care must be taken to ensure that such consent is meaningfully provided and not bundled by default.
"In addition to cyber security audits, it is imperative to establish a mechanism for performing algorithmic audits by independent and accredited auditors, prior to system deployment at periodic intervals," the discussion paper said.
It is pertinent to mention that FRT, like other intelligent algorithms, is fundamentally a data-intensive technology. To ensure that the data processing used to train and develop FRT systems is proper and lawful, it is imperative to have a codified data protection regime in the country at the earliest, the discussion paper further stated.
"The new data protection bill must retain the framework to ensure data protection, including obligations, enforcement mechanisms, a regulatory agency, penalties, and remedies from the Personal Data Protection Bill, 2019.
"Furthermore, such a regime must not be limited to regulating data processing by private entities but must adequately codify protections for fundamental right to privacy against state agencies (including law enforcement)," it added.
NITI Aayog further recommended that sensitive personal data, including biometric data such as facial images and scans, should be protected under the new data protection law.
"Consequently, it is recommended that rigorous standards for data processing, as well as the storage and retention of sensitive biometric data should be adequately addressed in any proposed data protection regime, to address privacy risks associated with FRT systems," it concluded.
Source: IANS