Privacy laws: Alternatives to consent


As changes in technology have made it nearly impossible to obtain informed consent, the solution may lie in an accountability-based standard for privacy protection.


This article was published in Livemint on August 11, 2017. Pranesh Prakash was quoted.


On 1 August, the government set in motion the process of drafting a new data protection law by setting up a panel under the guidance of former Supreme Court judge B.N. Srikrishna. The panel has been asked to suggest the principles to be considered while framing a data protection law. Most lawmakers around the world resort to consent as the default model to protect personal privacy. But is consent really the best and only way to provide meaningful control and to protect the individual?

In an earlier article in this series, we discussed the various reasons why consent is no longer the best way to protect personal privacy. Today, traditional point-to-point transfers of data have been replaced by data flows through distributed systems, making it difficult for individuals to know which organizations are processing their data and for what purposes. In this context, it is impossible to obtain valid individual consent. Machine learning systems do not need explicit programming and can teach themselves from mountains of data. This makes consent particularly inappropriate: where these tools are used for purposes such as fraud prevention, seeking consent would defeat the very purpose of the processing.

Europe’s new General Data Protection Regulation (GDPR), which will come into force in May 2018, seems to suggest that accountability will become the new basis for compliance. According to experts, the transition period until the new rules come into force will be all about getting data controllers to adopt accountability measures to ensure greater security and trust around processing.

The new rules “advocate a risk-based approach with the data subject at its centre, so controllers will need to assess any risks to individuals posed by their processing activities and what measures they need to take to address them. The requirements also identify common factors for controllers to take into account when making those assessments, like the state of the art, the cost of implementation and the nature, scope and purposes of data processing,” according to a paper by Irish law firm Matheson.

In India, a paper by Rahul Matthan, a fellow at the Bengaluru-based policy think tank Takshashila Institution, bats for the adoption of a similar model that would hold data controllers and processors accountable for any harm caused to data subjects, irrespective of the consent they may have obtained. Instead of requiring data controllers to obtain consent for the collection and subsequent use of personal data, Matthan suggests the implementation of a rights-based model for data privacy that would impute a set of data rights to everyone, rather than rely on the specific terms and conditions that they have entered into with each site they sign up for.

The accountability model will have the greatest impact on companies that deal with personal data, increasing their obligations to ensure that their actions do not, even inadvertently, result in breach of the privacy of their subscribers. What do these firms think about a new model where privacy is not based so much on the specific policies that their users agree to, but on a much broader obligation to be accountable for their actions?

“When we think about new products, we design them from the ground up with privacy in mind,” a Facebook Inc. spokesperson said in an emailed response. “We complete thorough privacy reviews of our products so that innovation does not come at the expense of choice and control. We integrate tools people can use to control their information and make personal privacy choices.”

A Twitter Inc. spokesperson did not directly address the question of accountability but pointed to its updated privacy policy, new privacy tools and past efforts in advocacy of privacy.

In general, corporations are likely to find accountability an easy standard to comply with. Most already adhere to this higher standard of care because, regardless of the specific terms of their privacy policies, the public relations fallout from a privacy breach caused by their negligence would severely damage subscriber confidence in their services.

To that extent, most companies already think of themselves as being responsible for the personal privacy of their users above and beyond the specific terms and conditions of their privacy policy.

But can accountability totally replace consent? Opinions are divided.

“Substituting accountability for consent is neither simple nor easy,” said Pranesh Prakash, policy director at the Centre for Internet and Society, a Bengaluru-based think tank. “With current consent models, one doesn’t necessarily need to prove specific harm, whereas accountability models might require it, and that would be difficult, and especially impossible given the current state of courts.”

Secondly, while a rights/fiduciary model brings flexibility for data controllers and data users, it comes at the cost of uncertainty, he argued.

“Consent brings in some amount of inflexibility but with the benefit of certainty,” he said. “If we move to a rights and fiduciary duty model, that would mean the entity using your data cannot do anything against your best interests, just as your accountant, or your doctor, or your lawyer owe you a high standard of care. But with that increased duty, there comes the added flexibility in terms of using data anonymously, in a way that doesn’t cause much harm while providing benefits.”

“I agree that consent, in theory, provides greater certainty,” counters Matthan. “However, it is questionable whether we can actually benefit from that certainty. In today’s context, it is impossible to obtain truly informed consent. We must, therefore, find an alternative mechanism to protect the privacy of our citizens. Accountability shifts the responsibility of determining whether or not a particular use of data will harm an individual away from that person, who has little or no ability to accurately decide that for himself, to the data controller, who has a far greater ability to do so.”

Others, such as advocate and cyber law expert N.S. Nappinai, say that it should not be a question of either/or and that both consent and accountability are needed for a robust data protection law.

“A huge loophole in the laws across the world, including the very robust GDPR, which will come into effect in 2018, is the sharing of third-party data, as in social media,” said Nappinai. “Data protection laws address the need for consent of the user who is sharing content. Many times, the user isn’t sharing sensitive or personal information only about themselves; it can be about a much larger audience or set of data subjects. When one is dealing with that kind of data, which a third party has shared about a data subject, it is not enough to have only accountability or consent but also vesting of responsibility.”

“For now the least threshold of protection that the GDPR offers—i.e., of the ‘right to be forgotten’—ought to at least be codified in other jurisdictions including India to ensure protection of such third-party data that is shared, in effect without their consent,” she added.

Models for a new privacy protection framework

There are alternative mechanisms in the privacy toolkit and existing legal regimes that, in the appropriate contexts, are able to deliver privacy protection and meaningful control more effectively than consent. Though these mechanisms already exist, they must be better understood, further developed and more broadly accepted, suggest researchers at the International Association of Privacy Professionals (IAPP). Here are a few examples of such mechanisms.

• Legitimate-interest processing: This is particularly relevant, according to IAPP, as it provides the necessary flexibility to face future technology and business process changes, while requiring organizations to be proactive, to think hard, and to consider and mitigate risks and harmful impacts on individuals as they process personal data.

Legitimate-interest processing can legitimize many ordinary business uses of data, such as improving and marketing a company’s own products or services, or ensuring information and network security.

It also plays an increasingly significant role in the context of Big Data, the Internet of Things and machine learning by enabling beneficial uses of data where consent is not feasible and the benefits of the proposed uses outweigh any privacy risks or other harmful impact on individuals.

• Focus on risk and impact on individuals: This approach, IAPP has said, puts individuals firmly at the centre of an organization’s information management practices and results in better protection and compliance for individuals, especially in contexts where individual consent is neither required nor feasible.

• Individuals’ rights to access and correction: The ability of individuals to have access to their data and be able to correct inaccurate or obsolete data is an essential mechanism of control that should be made available as widely as possible.

Access and correction are also intrinsically related to transparency and organizations may be able to innovate here too, IAPP researchers have noted.

• Fair processing: Fair processing is a standalone data protection principle in many data privacy laws in Europe and beyond. Over the years, practitioners and regulators have equated fairness with providing privacy notices to individuals. Fair processing, however, goes beyond privacy notices, and IAPP researchers believe the time has come to bring this principle back into practice.

This is the third of a four-part series on privacy. Read the first part here and the second part here.