Name of the Author(s) - Devyani Mishra, Rituraj Singh Parmar
University/Organisation/Profession - NLIU
Area of Law - Data Privacy
Introduction
Upon the expiration of my anti-virus protection, the app religiously reminded me to renew the subscription every time I opened my laptop; the only two options on offer, however, were to either “accept the risk” or “reactivate the subscription”. This is one of a plethora of examples of dark patterns that one comes across while navigating internet platforms and websites. Recently, on 30th November, 2023, the Central Consumer Protection Authority (“CCPA”) notified the Guidelines for Prevention and Regulation of Dark Patterns, 2023 (“Guidelines”) to prohibit the use of dark patterns by any person, including all platforms. The Guidelines define a dark pattern as any deceptive design configuration employed by platforms to mislead users into doing something they did not originally intend to do. These patterns are crafted in a way that interferes with users’ ability to make their own choices, amounting to misleading advertising or unfair trade practices that violate consumer rights.
While the introduction of the Guidelines is a positive move given the need to regulate deceptive online behaviour, they nonetheless suffer from various lacunae, which the authors seek to bring forth in this article. In the authors’ opinion, the recognition of new dark patterns under the Guidelines falls short of the stated objective of curbing deceitful online practices, because the definition clause itself is questionable. First, the definition of a dark pattern is restrictive, particularly because the Guidelines are centred around the concept of “intention”. This provides leeway to platforms and advertisers to circumvent the law on dark patterns. A second issue is that the definition turns on whether the user interface actually influenced or manipulated the decision-making process of the user. Furthermore, the Guidelines do not cover various dark patterns that affect privacy and lead to unauthorised access to personal data. Lessons could therefore be drawn from Californian and EU law to make the framework more effective, which the authors delve into in the later part of the article.
The restrictive definition of dark pattern:
The Guidelines seek to put a stop to intentionally deceitful actions, by way of dark patterns, of platforms offering goods and services. They provide an illustrative list of thirteen different kinds of dark patterns used by platforms to influence the decision-making of users and consumers, and contravention of the Guidelines attracts consequences under the Consumer Protection Act, 2019.
It is pertinent to note that the Guidelines issued by the CCPA are intention-oriented in approach, which in turn severely restricts the scope of regulation of dark patterns. The first segment of the definition of dark patterns in the Guidelines uses wording that emphasises the requirement that dark patterns be employed with intent. This necessity of intention reduces the capacity of the Guidelines to adequately safeguard consumers against the practices enumerated in Annexure 1. For instance, Section 2(e) includes the phrase “designed to mislead or trick users”, which implies a requirement of intention. An intentionality requirement, however, would restrict the application of the Guidelines in curbing dark pattern practices because, in today’s times, numerous services employ automated systems and programs, such as AI, to design or build platforms on their behalf. A question therefore arises: how can modern websites and service platforms generated by machines possess an intention that is inherently absent in programs? Moreover, an individual can hardly be deemed culpable in a situation where human intent is absent behind the formulation of these dark patterns.

Another instance of a dark pattern without pre-existing intention could be an attack by viruses or bugs on a website or platform. Nagging, one of the dark patterns recognised in the Guidelines, means the repeated and frequent disruption of user enjoyment via requests or notifications. An attack by a virus might lead to continuous spamming of pop-ups without the knowledge or intention of the website owner, which would nonetheless qualify as ‘nagging’. This is a matter that warrants exploration and consideration in subsequent legislative endeavours. It is noteworthy that the approach adopted by the Digital Services Act of the European Union is not intent-based and covers a broad variety of services and dark patterns.
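To illustrate how nagging-like behaviour can arise without any design intent, consider a minimal, hypothetical sketch in TypeScript. The developer intends to show a renewal reminder once a week, but a small persistence bug causes the prompt to reappear on every page load. All names and values below are illustrative, not drawn from any actual platform:

```typescript
// Hypothetical sketch: a renewal reminder that becomes "nagging" without intent.
// The developer meant to show the prompt at most once a week, but a bug in the
// persistence logic means the dismissal is never recorded, so the user is
// re-prompted on every page load -- behaviour indistinguishable from nagging.

const REMINDER_INTERVAL_MS = 7 * 24 * 60 * 60 * 1000; // intended: once a week

function shouldShowRenewalPrompt(): boolean {
  const lastShown = localStorage.getItem("renewalPromptLastShown");
  if (lastShown === null) return true;
  return Date.now() - Number(lastShown) > REMINDER_INTERVAL_MS;
}

function showRenewalPrompt(): void {
  // Bug: the timestamp is written under a different key than the one read
  // above, so shouldShowRenewalPrompt() always returns true.
  localStorage.setItem("renewalPromptLastShownAt", String(Date.now()));
  alert("Your subscription has expired. Renew now?");
}

if (shouldShowRenewalPrompt()) {
  showRenewalPrompt();
}
```

The user experiences exactly the “repeated and frequent disruption” the Guidelines describe, yet no one intended it, which is precisely why an intent-based definition leaves such conduct unaddressed.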
The above discussion brings us to another concern: the definition under Section 2(e) provides that any user interface designed to trick consumers into taking decisions they initially did not want to take, or which influences consumers’ decisions, constitutes a dark pattern. In essence, the definition indicates that the user must be led into making a decision that may not ultimately serve their best interests. But the question arises: if the user finds a way to avoid being deceived by such manipulative tactics or designs, does the design cease to be a dark pattern? This is another facet of the definition clause that requires revision, as it could be exploited by platforms seeking to avoid accountability before the CCPA for employing dark patterns.
Need to recognise the data protection aspect of dark patterns
One significant aspect that the present Guidelines fail to address is the use of dark patterns to access the personal data of individuals. At the outset, the definition of dark patterns does not capture deceitful practices carried out to extract personal data, unlike foreign jurisprudence on dark patterns. The California Consumer Privacy Act lays down detailed requirements prohibiting the use of dark patterns that hinder users’ right to opt in and opt out on any platform; opting in refers to providing consent for the use of personal data, and opting out means withdrawing that consent. The Act prohibits the use of confusing and ambiguous language, such as double negatives (e.g., “Do Not Not Sell My Personal Information”); making consumers read or listen to reasons for not opting out; requiring consumers to provide personal information that is not necessary for opting out; and making the opt-out process more burdensome than opting in, such that the withdrawal of consent takes more steps than giving it. It further states that consent to opt in shall not be obtained using dark patterns.
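By way of illustration only, the contrast between a deceptive consent flow and a symmetric one of the kind the Californian rules contemplate might be sketched as follows (the labels, step counts, and function names are hypothetical, not drawn from the statutory text):

```typescript
// Hypothetical sketch contrasting the two consent flows discussed above.

// Deceptive pattern: a double-negative label forces the user to reason
// backwards, and opting out takes more steps than opting in.
const deceptiveOptOut = {
  label: "Do Not Not Sell My Personal Information", // double negative
  stepsToOptIn: 1,
  stepsToOptOut: 5, // confirmation screens, surveys, "are you sure?" dialogs
};

// Symmetric pattern: plain language, and withdrawing consent is no harder
// than giving it.
const symmetricChoice = {
  label: "Sell my personal information",
  stepsToOptIn: 1,
  stepsToOptOut: 1,
};

// A simple symmetry check a platform might run over its own consent flows.
function isSymmetric(flow: { stepsToOptIn: number; stepsToOptOut: number }): boolean {
  return flow.stepsToOptOut <= flow.stepsToOptIn;
}

console.log(isSymmetric(deceptiveOptOut)); // false -- likely a dark pattern
console.log(isSymmetric(symmetricChoice)); // true
```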
Furthermore, the EU guidelines on dark patterns define them as interfaces that lead users into making unintended and potentially harmful decisions regarding the processing of their personal data. They recognise various kinds of dark patterns affecting personal data, including deceptive snugness (data-invasive options turned on by default), lacking hierarchy (information relating to data protection presented in an ambiguous and unorganised manner), hidden in plain sight (information on data protection provided in a way that can easily be overlooked), and emotional steering (compelling users to share personal data by evoking highly positive or negative emotions). These examples are illustrative rather than exhaustive, and all of them are implicitly prohibited under the principles of transparency and lawfulness recognised under Articles 12 and 6 of the General Data Protection Regulation (“GDPR”), respectively.
However, the Indian Guidelines do not provide for the regulation of dark patterns affecting personal data, putting individuals’ personal data at risk. The Digital Personal Data Protection Act, 2023 does require free, specific, and unambiguous consent for processing personal data, and provides for a smooth opt-out process, i.e., the ease of withdrawing consent should be comparable to the ease with which such consent was given. Nonetheless, the Act does not explicitly prohibit dark patterns, unlike the California Consumer Privacy Act, under which consent obtained through dark patterns is not considered valid consent.
Furthermore, the present framework does not incorporate privacy by design or privacy by default. The former means that the design of a platform should be oriented towards privacy protection; integrating privacy by design into the initial stages of development ensures that appropriate data protection principles are established at the later stages. The latter is the concept that privacy settings should, by default, adhere to principles that consistently prioritise the best interests of data subjects. Incorporating privacy by default and data protection by design is crucial when evaluating dark patterns, as it ensures the protection of data at the outset. In India, these concepts were provided for under Section 22 of the Personal Data Protection Bill, 2019; the Digital Personal Data Protection Act, 2023, however, is silent on this aspect. The GDPR is prospective in nature, meaning that it seeks to prevent and regulate dark patterns concerning personal data before the harm occurs. In contrast, the Indian framework is reactive, since it focuses on penalising after the harm has already been done. The Guidelines presented an opportunity for the consumer protection authorities to assimilate privacy by design and privacy by default and thereby regulate dark patterns affecting personal data; regrettably, the authorities failed to seize this chance.
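A minimal, hypothetical sketch of what privacy by default looks like in practice: every invasive setting starts in its most protective state, and any deviation from the defaults requires an explicit user action. All field and function names below are illustrative assumptions:

```typescript
// Hypothetical sketch of "privacy by default": every data-sharing setting
// starts in its most protective state, and anything more invasive requires
// an explicit, recorded opt-in from the user.

interface PrivacySettings {
  shareUsageAnalytics: boolean;
  personalisedAds: boolean;
  locationTracking: boolean;
}

// Privacy by default: all invasive processing is off until the user opts in,
// the inverse of the "deceptive snugness" pattern described above.
const defaultSettings: PrivacySettings = {
  shareUsageAnalytics: false,
  personalisedAds: false,
  locationTracking: false,
};

// Any change away from the defaults must come from an explicit user action,
// never from a pre-ticked box or a buried setting.
function applyUserOptIns(
  optIns: Partial<PrivacySettings>,
  userConfirmed: boolean
): PrivacySettings {
  if (!userConfirmed) return defaultSettings;
  return { ...defaultSettings, ...optIns };
}
```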
Conclusion:
Dark patterns involve the interplay of various laws, including the Consumer Protection Act, the Contract Act, and data protection law. In this context, the Guidelines introduced by the CCPA are a welcome move. However, based on the above analysis, it can be concluded that the Guidelines are not a panacea, as they suffer from multiple shortcomings. The intention-based approach adopted by the Indian regulatory authorities would, at the very outset, backfire and make the law difficult to enforce, since technical difficulties will hinder the establishment of intention.
Moreover, consent has emerged as the fundamental pillar of the Digital Personal Data Protection Act, 2023. Overlooking dark patterns that impact personal data would contradict the core principles of the Act; failure to acknowledge such patterns could result in consistent and serious violations of the right to privacy. It is therefore suggested that the legislature incorporate privacy by design into the present Act and the Guidelines, thereby prohibiting the manipulation of consumers into consenting to the provision of personal data, by means of dark patterns or otherwise. Furthermore, amendments should be introduced to shift the focus away from the intention-based nature of the Guidelines, so as to widen the domain of accountability of websites and platforms.