The conundrum when it comes to health data is between the public benefit aspect of aggregated, anonymised health data, and individual privacy. Kazim Rizvi, Founder, The Dialogue, a Delhi-based tech policy think tank, and Karthik Venkatesh, Research Coordinator, The Dialogue, therefore opine that the data protection law must be passed without delay to establish a strong foundation of rights empowering individuals with regard to their personal data, and offer some suggestions on the way forward
The COVID-19 pandemic has led to a datafication of healthcare in more ways than one, be it dashboards to identify trends at the state level or the use of contact tracing apps. The wheels of digital healthcare and a data-centric approach to medical practice were set in motion before the pandemic. Now, there is an increased reliance on predictive medicine, digital healthcare, and open-sourcing anonymised datasets to develop solutions for the public health issues that plague society at large. Underlying all of these innovations and developments, the sharing of data, in either de-identified or anonymised form, is crucial.
Digital healthcare in India is fraught with security and privacy issues that have implications for an individual’s personal autonomy, dignity, and even access to work. The risks associated with digital health data come from malicious actors seeking unauthorised access, or from negligent or unauthorised exposure due to poor data management practices. At the organisational level, technical and governance standards for handling data aim to reduce the risks associated with improper or negligent handling. There is a need for privacy-enhancing mechanisms to ensure accountability and transparency in dealings with health data.
When it comes to the legal and regulatory landscape, the IT Act along with the Sensitive Personal Data or Information Rules (“Rules”) are binding on all entities dealing with health data. Under the proposed Personal Data Protection (PDP) Bill (“the Bill”), health data (electronic or otherwise) is treated as sensitive personal data. The Bill imposes binding obligations on data fiduciaries towards data principals regarding the handling of personal data, along with consent requirements and penalties for breaches and leakages. However, the provisions of the Bill apply only to health data that is personally identifiable to the individual to whom it pertains. In anonymised or de-identified form, the data falls outside the ambit of the Bill, and consequently the obligations thereunder. In other words, the technical process of de-identification or anonymisation results in different levels of protection for an individual’s data under the proposed regulatory scheme. Standard-setting is crucial to plug these loopholes, which can otherwise have adverse externalities for individual privacy due to poor data management.
Anonymisation standards/de-identification of health data
Standards of anonymisation for the health sector aid in allowing data sharing while ensuring the privacy of individuals. The conundrum when it comes to health data is between the public benefit aspect of aggregated, anonymised health data, and individual privacy. For certain purposes, such as disease monitoring, research, and healthcare innovation, open health data and access to anonymised datasets are essential.
Drawing from other jurisdictions, the US has specific legislation on handling health data. HIPAA specifies a “safe harbor method” and an “expert determination method” to be followed when de-identifying health data. The UK’s ICO has published a code of practice on anonymisation, closely linked to its privacy legislation, which organisations can follow while sharing data with third parties. Technical standards, covering anonymisation processes and de-identification methods and the instances where they are relevant, can standardise data practices and bring uniformity in functioning. There is growing consensus that no process of anonymisation is irreversible; there have been instances where re-identification occurred by combining various data points. In such contexts, strict access controls can aid in ensuring that privacy is maintained even while such data is shared.
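To make the safe harbor idea concrete, the sketch below shows, in Python, one way a de-identification step might look: dropping direct identifiers and generalising quasi-identifiers. It is a minimal illustration only; the field names and the short identifier list are hypothetical and do not cover the full set of identifiers HIPAA enumerates, nor do they substitute for an expert determination.

```python
# Minimal sketch of safe-harbor-style de-identification.
# The field names and identifier list are illustrative assumptions,
# not the full HIPAA identifier set.

DIRECT_IDENTIFIERS = {
    "name", "phone", "email", "address", "medical_record_number",
}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers, then generalise quasi-identifiers."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Quasi-identifiers (birth date, location) can re-identify a person
    # when combined, so keep only coarse versions of them.
    if "date_of_birth" in cleaned:
        cleaned["birth_year"] = cleaned.pop("date_of_birth")[:4]
    if "pin_code" in cleaned:
        cleaned["pin_region"] = cleaned.pop("pin_code")[:3] + "XXX"
    return cleaned

patient = {
    "name": "A. Sharma",
    "phone": "+91-98XXXXXXXX",
    "date_of_birth": "1984-06-12",
    "pin_code": "110001",
    "diagnosis": "Type 2 diabetes",
}
print(deidentify(patient))
# {'diagnosis': 'Type 2 diabetes', 'birth_year': '1984', 'pin_region': '110XXX'}
```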
Anonymisation always involves a tension between utility and privacy. From a business point of view, the utility of a dataset reduces drastically if it is anonymised beyond a point. Given these technical limitations, there is a need to renegotiate how to preserve the privacy of health data while promoting innovation and better service delivery.
International best practices
Various advanced jurisdictions across the world have introduced specific legislation mandating technical and governance standards for handling health data. HIPAA in the United States specifies the identifiers relating to Protected Health Information that are to be removed under the safe harbor method of de-identification. This mitigates privacy risks to individuals and thereby supports the secondary use of data for comparative effectiveness studies, policy assessment, life sciences research, and other endeavours.
The US and the UK consider the data security environment and the intended data use when deciding the degree of de-identification required. Having said that, the legislative vacuum in India creates adverse conditions for stakeholders to effectively exercise their rights to privacy and security as far as their data is concerned.
Consent models
The mandate regarding consent in the PDP Bill includes four specific attributes: informed, specific, clear, and the capacity to withdraw consent. In the age of big data analytics, there is a growing consensus that these standards of consent are insufficient to maximise user agency. In pursuit of a more cohesive consent model, we could look at the concept of ‘graduated consent’. Under this model, data principals can consent to anonymisation for each type of data throughout their contract with the service provider, rather than making a binary choice that is collected only once. Further, adding a temporal aspect to consent could be tested as a model too: for specifically identified purposes, consent can be provided for data access for a defined time frame, ensuring that the consent stands only for that period and that the data is no longer used after it lapses.
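As an illustration of how graduated, time-bound consent might be represented, here is a minimal sketch in Python of a hypothetical consent record; the data categories, purposes, and expiry logic are assumptions made for illustration, not requirements drawn from the Bill.

```python
# Minimal sketch of a "graduated", time-bound consent record.
# Categories, purposes, and the storage model are hypothetical; a real
# consent manager would also need auditable logs and revocation.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ConsentGrant:
    data_category: str        # e.g. "diagnostic_reports"
    purpose: str              # e.g. "disease_surveillance"
    allow_anonymised_use: bool
    granted_at: datetime
    valid_for: timedelta      # temporal aspect: consent lapses automatically

    def is_valid(self, at: datetime) -> bool:
        return self.granted_at <= at < self.granted_at + self.valid_for

# Consent is collected per data category and purpose, not as a single
# one-time binary choice, and it expires after the agreed time frame.
grant = ConsentGrant(
    data_category="diagnostic_reports",
    purpose="disease_surveillance",
    allow_anonymised_use=True,
    granted_at=datetime(2021, 1, 1),
    valid_for=timedelta(days=180),
)
print(grant.is_valid(datetime(2021, 3, 1)))   # True: within the time frame
print(grant.is_valid(datetime(2022, 1, 1)))   # False: consent has lapsed
```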
Way forward
As India experiences a wave of rapid digitisation of health records, and with the proposed digital health mission, health data management, both offline and online, is crucial. Legal and regulatory frameworks that allow individuals to better exercise their agency over their health data are essential. The risk of exposure of health data is real and tangible, and our safest bet is stronger laws and systems. The data protection law must be passed without delay, to establish a strong foundation of rights for empowering individuals with regard to their personal data.
Towards this, a sectoral approach to data governance needs to be adopted, particularly for sectors such as health that fall within the ambit of sensitive data and carry a higher degree of harm to individuals from leakage or unauthorised exposure. The health sector and the digital health ecosystem have seen heightened economic activity in the recent past. To prevent the commodification of health data, a principled approach that respects individual rights has to be adopted. Industry-level codes that take into consideration technical and governance principles, while allowing for voluntary sharing of anonymised health data, could deliver the benefits of technology in a sustained manner. Constituting a technical expert committee for the periodic audit of the techniques deployed in the healthcare sector could ensure that technological advancement is backed by safeguards that benefit individuals. Finally, a harmonised regulatory regime and clear guidance to the various stakeholders are vital to enhance public trust and prevent regulatory arbitrage.