

Edited by Elonnai Hickok and Amber Sinha


Former Supreme Court judge Justice B.N. Srikrishna, who is currently involved in drafting the new data-privacy laws for India, was recently quoted by Bloomberg[1]. Acknowledging the ineffectiveness of tech companies' consent forms, which leads to users' data being collected and misused, he asked whether we should have pictographic warnings for consent, much like the warnings on cigarette packets. His concern is that an average Indian does not realise how much data they are generating or how it is being used. He attributed this to the inaccessibility of the consent forms presented by companies, which are in English. In the Indian context, Justice Srikrishna pointed out, considerations around literacy and language should be addressed.

The new framework being worked on by Srikrishna and his committee of academics and government officials would make tech companies more accountable for data collection and use, and allow users more control over their own data. But in addition to this regulatory step towards privacy and data protection, how companies communicate their data practices through consent forms or privacy notices is also critical for users. Currently, cryptic notices are a barrier for users, as are services that do not provide incremental information about their use - for example, what data is being shared with how many people, or what data is being collected at what point - and instead rely on blanket consent forms taken at the beginning of a service. Visuals can go a long way in making these notices and services accessible to users.

Although Justice Srikrishna chose the extreme example of warnings on cigarette packets, which visually depict the health risks of smoking using repulsive imagery, the underlying intent seems to be to use visuals as a means of giving an immediate and clear warning about how people's data is being used and by whom. It must be noted that the effectiveness of warnings on cigarette packets is debatable. These warnings are also a way in which manufacturers consider their accountability met, which is a possible danger with privacy notices as well. Most companies consider their accountability limited to presenting all the information to users, without ensuring that the information is communicated in a way that helps users understand the risks. Hence, one has to be cautious about the role of visuals in notices, ensuring that they are used with the primary purpose of meaningful communication and accessibility that can inform further action. A visual summary of a data practice, framed in terms of how it will affect the user, will also serve as a warning.

The warning images on cigarette packets are an example of the user-influencing design approach called nudging[2]. While nudging techniques are meant to be aimed at users' well-being, they raise the question of who decides what is beneficial for users. Moreover, the harm in cigarette smoking is more obvious, and thus the favourable choice for users is also clearer. But in the context of data privacy, the harms are less apparent. It is difficult to demonstrate the harms or benefits of data use, particularly when data is re-purposed or used indirectly. There is also no single choice that can be pushed when it comes to the use and collection of data. Different users may have different preferences or degrees to which they would like to allow the use of their data. This raises deeper questions about the extent to which privacy law and regulation should be paternalistic.

Nudges are considered to follow the soft or libertarian paternalism approach, where the user is not forbidden any options but only given a push to alter their behaviour in a predictable way[3]. It is crucial to differentiate between the strong paternalistic approach that does not allow a choice at all, the usability approach, and the soft paternalistic approach of nudging, as discussed by Alessandro Acquisti in his paper 'The Behavioral Economics of Personal Information'[4]. In the usability approach, the design of the system makes it intuitive for users to change settings and secure their data. The soft paternalistic approach of nudging goes a step further and presents secure settings as the default. Usability is often prioritised by designers; however, soft paternalism techniques help to enhance choice for users and lead to greater welfare[5].
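To make the distinction concrete, the sketch below (in TypeScript, purely as an illustration - the setting names and values are invented for this example and are not drawn from any cited source) shows how the same set of user-changeable settings can ship with privacy-protective defaults, which is the soft-paternalistic nudge, or with sharing-friendly defaults that rely on user inertia.

```typescript
// Hypothetical privacy settings for a digital service (names are illustrative only).
interface PrivacySettings {
  shareUsageData: boolean;                          // analytics / behavioural data
  personalisedAds: boolean;                         // ad targeting based on profile data
  locationAccess: "always" | "while-in-use" | "never";
}

// Soft-paternalistic approach: the most privacy-protective values are the defaults.
// The user can still change every option, but inaction does not compromise privacy.
const protectiveDefaults: PrivacySettings = {
  shareUsageData: false,
  personalisedAds: false,
  locationAccess: "never",
};

// A sharing-friendly default, by contrast, pre-selects data collection and relies
// on the user's inertia to keep it that way.
const sharingDefaults: PrivacySettings = {
  shareUsageData: true,
  personalisedAds: true,
  locationAccess: "always",
};

// Either way, the user retains the choice - the nudge lies entirely in which
// configuration applies when they do nothing.
function applyUserChoices(
  defaults: PrivacySettings,
  overrides: Partial<PrivacySettings>
): PrivacySettings {
  return { ...defaults, ...overrides };
}
```

In both configurations the available options are identical; only the behaviour of a user who never opens the settings screen changes, which is precisely what distinguishes a nudge from a restriction of choice.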

Nudging in privacy notices can be a privacy-enhancing tool. For example, informing users of how many people would have access to their data would help them make a decision[6]. However, nudges can also be used to influence users towards making choices that compromise their privacy. For example, the visual design of default options on digital platforms currently nudges users to share their data. It is critical to ensure that nudges are used mindfully and directed at the well-being of users.

The design of privacy notices should be re-conceptualised to ensure that they inform users effectively, keeping in mind certain best practices. For instance, a multilayered privacy notice can be used, which includes a very short notice designed for portable digital devices with limited space, a condensed notice that contains all the key factors in an easy-to-understand way, and a complete notice with all the legal requirements[7]. Along with the layering of information, the timing of notices should also be designed: at setup, just in time for the user's action, or at periodic intervals. In terms of visuals, infographics can be used to depict data flows in a system. Another best practice is to integrate privacy notices with the rest of the system. Designers need to be involved early in the process so that design decisions are not purely visual but also consider information architecture, content design, and research.
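As a rough sketch of how such a layered, well-timed notice might be modelled - the field names, layer labels, and example content below are assumptions for illustration, not part of any cited standard - consider:

```typescript
// A minimal, hypothetical model of a multilayered privacy notice.
// The layer names follow the short / condensed / complete structure described above;
// everything else (field names, trigger values, example text) is illustrative.
type NoticeLayer = "short" | "condensed" | "complete";

type NoticeTrigger = "at-setup" | "just-in-time" | "periodic";

interface PrivacyNotice {
  layer: NoticeLayer;
  trigger: NoticeTrigger;
  text: string;            // plain-language summary shown to the user
  infographicUrl?: string; // optional visual, e.g. a data-flow diagram
}

// Example: a just-in-time short notice shown when the user is about to share their location.
const locationNotice: PrivacyNotice = {
  layer: "short",
  trigger: "just-in-time",
  text: "Your location will be shared with the delivery partner for this order only.",
  infographicUrl: "/assets/location-data-flow.svg", // hypothetical asset path
};
```

The point of modelling notices this way is that the short and condensed layers, their timing, and any accompanying infographic become explicit design decisions rather than afterthoughts appended to the complete legal text.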

Practice-based frameworks should be developed for communication designers in order to have a standardised vocabulary around creating privacy notices. Additionally, multiple user groups and their varied privacy preferences must be taken into account. Finally, an ethical framework must be put in place for design practitioners in order to ensure that users' well-being is prioritised and notices are designed to facilitate informed consent. Further recommendations and concerns regarding the design of privacy notices and the use of visuals can be read here.

Justice Srikrishna's statement is an important step towards creating effective privacy notices with visuals. The conversation on the need to design privacy notices can lead to clearer and more comprehensible notices. Combined with the enforcement of fair collection and use of data by companies, well-designed notices will give users more control and a real choice to opt in to or out of a service, and to make informed choices as they engage with it. Justice Srikrishna's analogy seems to recommend using visuals to describe what type of data is being collected and for what purposes at the time of taking consent. Though cigarette warnings may not be the most appropriate analogy, this is a good start, and it is important to explore how visuals and design can be used throughout a service - from beginning to end - to convey and promote awareness and informed choices by users. It is also important to extend this conversation beyond privacy into the realm of security, and to understand how visuals and design can inform users' awareness and personal choices around security when using a service.


[1] https://www.bloomberg.com/news/articles/2018-06-10/tech-giants-nervous-as-judge-drafts-first-data-rules-in-india

[2] http://www.ijdesign.org/index.php/IJDesign/article/viewFile/1512/584

[3] https://www.andrew.cmu.edu/user/pgl/psosm2013.pdf

[4] https://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf

[5] https://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf

[6] https://cis-india.org/internet-governance/files/rethinking-privacy-principles

[7] https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf
