Legally Valid Consent Under India’s DPDP Act, 2023: Can Consent Managers Truly Ensure Informed Choice?
Vedant Raj Chaurasiya
Student, Amity Law School, Amity University Madhya Pradesh
Email ID: vedantrajchaurasiya@gmail.com
ABSTRACT
This article examines the legal validity of consent under India’s Digital Personal Data Protection Act, 2023, with a focus on the emerging role of consent managers within the Digital Public Infrastructure framework. It analyzes statutory definitions of consent, including requirements of free, informed, specific, and unambiguous approval by data principals. The paper evaluates whether consent managers effectively ensure genuine user choice or merely facilitate procedural compliance, considering design practices that may undermine informed consent. Comparative insights from the European Union’s GDPR and California’s CPRA are discussed to highlight global standards and challenges. Finally, the article proposes regulatory and technological reforms to strengthen consent mechanisms and better align practice with the law’s intent. The study aims to provide nuanced perspectives on bridging legal frameworks, user interface design, and enforcement to uphold meaningful data privacy protections in India.
KEYWORDS: Digital Personal Data Protection Act (DPDPA, 2023); Consent Managers; Informed Consent in Data Governance; Data Fiduciaries and User Rights; Consent-Based Legal Frameworks; Interface Design and Dark Patterns; Granular and Revocable Consent; Data Empowerment and Protection Architecture (DEPA); Comparative Analysis of GDPR and CPRA; Regulatory Design in Digital Infrastructure.
INTRODUCTION
The exponential growth of the digital economy in India has transformed the landscape of personal data governance. With rising digitization across public and private sectors—from e-commerce to e-governance—personal data has emerged as a critical asset and a potential liability. This transformation has prompted increasing regulatory focus on ensuring meaningful data protection, centered primarily around the concept of user consent.
The Digital Personal Data Protection Act, 2023 (hereinafter “DPDP Act”) enshrines consent as the cornerstone for lawful data processing, framing it as both a right of the individual and a duty of the data fiduciary[i].
At the heart of this new data governance model lies the concept of legally valid consent, that is, consent that is free, informed, specific, and unambiguous. While this formulation echoes global best practices, its practical enforceability in India remains contested. This is especially relevant given India's vast digital diversity and uneven levels of literacy and access. Recognizing these barriers, the Government of India introduced a novel institutional mechanism, Consent Managers, as part of its broader Digital Public Infrastructure (DPI) strategy. These intermediaries are designed to mediate consent transactions, purportedly empowering users with granular control over their data flows[ii].
Framed within the Data Empowerment and Protection Architecture (DEPA), Consent Managers are intended to serve as trusted, technology-neutral entities that help individuals manage and monitor consent across platforms, sectors, and services[iii]. They are envisioned not only as technical facilitators but as a layer of user-rights enablers who can promote ethical data use by design. Yet, despite their promise, the actual role and legal accountability of Consent Managers under the DPDP Act remain opaque.
This article investigates a critical legal question: Can Consent Managers, as introduced under India’s data protection framework, truly ensure that user consent meets the statutory standards of legal validity under the DPDP Act, 2023? In doing so, it examines the foundational legal definitions, the institutional role of Consent Managers, the regulatory and design challenges in operationalizing informed consent, and finally, how India’s approach compares to established regimes such as the European Union’s GDPR and California’s CPRA.
Ultimately, the analysis argues that while Consent Managers represent an innovative institutional experiment, their effectiveness in ensuring legally valid, informed consent depends on several as-yet-unresolved challenges, including regulatory clarity, user interface design norms, and accountability mechanisms. The paper concludes with reform-oriented proposals to bridge the gap between legislative ideals and real-world execution.
Legal Foundations of Consent in India’s DPDPA, 2023
- A. Statutory Definition and Components of Valid Consent
- Free, Informed, Specific, and Unambiguous Consent
At the core of India’s Digital Personal Data Protection Act, 2023 (DPDP Act) lies a normative shift toward individual autonomy and user-centric control. The Act’s definition of consent in Section 6(1) provides that it must be “free, specific, informed, unconditional and unambiguous with a clear affirmative action”[iv]. This framework is clearly inspired by globally respected standards such as the European Union’s GDPR, where informed and affirmative user action is a cornerstone of lawful processing. However, the DPDP Act refrains from elaborating on what qualifies as “free” or “unambiguous,” leaving the task of interpretation open-ended. This ambiguity raises risks of superficial compliance by platforms that technically meet these definitions without upholding their spirit. For instance, users may be overwhelmed by overly technical notices, presented in legal jargon or non-local languages, which obstruct true comprehension. Without statutory benchmarks for notice readability, linguistic accessibility, or layered consent architecture, the mere act of ticking a box or clicking “I Agree” may continue to pass as lawful consent under Indian law[v].
- Section 6 of the DPDP Act and Obligations of Data Fiduciaries
Section 6 not only defines consent but also outlines corresponding obligations for data fiduciaries, requiring that they issue prior notice containing the nature of personal data being collected, purposes of processing, and a grievance mechanism, among others[vi]. These obligations mirror consent regimes across international frameworks, but the Act stops short of mandating the format, sequence, or prominence of disclosures within digital interfaces. This omission can result in inconsistent implementation, allowing platforms to bury critical information in long privacy policies or rely on interface designs that discourage withdrawal of consent. Moreover, unlike the GDPR or California’s CPRA, there is no mandate for offering consent in a modular or granular fashion. This effectively permits bundled consent — where users have to either agree to all data uses or forgo the service entirely. The resulting dynamic significantly dilutes the meaningful exercise of user choice, especially when access to digital platforms is near-essential for services like banking, education, or health[vii].
- Consent as a Legal Justification vs. a Procedural Formality
Though the DPDP Act aspires to make consent a substantive legal basis for data processing, in practice, it risks becoming a procedural façade. Indian platforms have historically leaned toward check-the-box compliance, which continues to be enabled by the lack of interface certification standards, enforcement precedents, or auditable records of consent flow. Additionally, unlike the GDPR’s concept of “demonstrable consent,” the DPDP Act does not explicitly require that data fiduciaries maintain verifiable audit trails to prove the legitimacy of each consent instance[viii]. This opens the door for a scenario where data fiduciaries legally justify data processing through boilerplate consent mechanisms that remain opaque, coercive, or misleading. Without regulatory oversight that actively scrutinizes interface design, linguistic accessibility, or coercive defaults, the Act’s promise of empowering individuals through consent may ring hollow. Therefore, while Section 6 lays a comprehensive textual foundation, its translation into enforceable practice remains precarious and highly contingent on robust consent facilitation infrastructure.
- B. Responsibilities of Consent Managers Under the DPI Framework
- Legal Status under DPDP and DEPA Ecosystem
Consent Managers are a novel institutional innovation situated within India’s broader Digital Public Infrastructure (DPI), particularly under the Data Empowerment and Protection Architecture (DEPA). While the DPDP Act, 2023 defines Consent Managers only briefly and leaves their detailed regulation to delegated rules, the policy frameworks on DEPA issued by NITI Aayog and the Ministry of Electronics and Information Technology (MeitY) recognize them as digital intermediaries tasked with routing, managing, and overseeing user consent flows across sectors like finance, health, and telecom[ix]. Under the Account Aggregator model, the first instantiation of DEPA, Consent Managers act as regulated entities (NBFC-AAs under the RBI) that must obtain user consent before data-sharing between financial institutions. The same architecture is now being applied to other domains, such as healthcare through the Ayushman Bharat Digital Mission, where Health Information Users (HIUs) rely on Consent Managers to access electronic health records. Despite their growing operational significance, the DPDP Act does not provide Consent Managers with a detailed accountability framework beyond registration with the Data Protection Board, creating a gap in terms of liabilities and oversight[x].
- Role in Facilitating vs. Verifying Lawful Consent
The primary function of Consent Managers is to facilitate user consent through secure and auditable digital interfaces. They act as technological conduits, not as legal guarantors of consent validity. In DEPA’s terminology, they enable “data flows based on granular, revocable, auditable and purpose-limited consent”[xi]. However, this facilitative role stops short of verifying whether the consent provided meets statutory standards—for example, whether it was truly informed or free from coercion. This distinction raises critical concerns. If a Consent Manager simply transmits a user’s agreement to share data, but does not evaluate the fairness of how that consent was obtained (e.g., deceptive UI, lack of disclosures), then the legal sufficiency of consent remains unexamined. The EDPB guidelines under the GDPR, for instance, require that consent mechanisms be assessed not only by legal standards but also by usability testing and transparency audits—something not yet incorporated into India’s framework[xii].
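To make the idea of a granular, revocable, auditable, and purpose-limited consent artefact concrete, the Python sketch below shows the kind of fields such an artefact might carry. The structure and field names are illustrative assumptions made for this article, not the official DEPA or Account Aggregator schema; notably, nothing in the data structure itself can prove that the underlying consent was informed or freely given, which is precisely the gap discussed above.

```python
from dataclasses import dataclass, field
from datetime import datetime
from uuid import uuid4

# Illustrative consent artefact; field names are assumptions, not the
# official DEPA/Account Aggregator specification. It models the four
# properties quoted in the text: granular, revocable, auditable,
# purpose-limited consent.
@dataclass
class ConsentArtefact:
    artefact_id: str = field(default_factory=lambda: str(uuid4()))
    data_principal_id: str = ""                     # the user who consents
    data_fiduciary_id: str = ""                     # the entity seeking data
    consent_manager_id: str = ""                    # the intermediary routing the flow
    purpose: str = ""                               # purpose limitation, e.g. "loan underwriting"
    data_categories: list[str] = field(default_factory=list)  # granularity
    granted_at: datetime = field(default_factory=datetime.utcnow)
    expires_at: datetime | None = None              # time-bound consent
    revoked_at: datetime | None = None              # revocability

    def is_active(self, now: datetime | None = None) -> bool:
        """Usable only if not revoked and not expired."""
        now = now or datetime.utcnow()
        if self.revoked_at is not None:
            return False
        return self.expires_at is None or now <= self.expires_at

    def revoke(self) -> None:
        """Record withdrawal; an auditable system would also log this event."""
        self.revoked_at = datetime.utcnow()
```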
- Certification and Compliance Mechanisms
Despite their centrality in operationalizing consent, there are currently no uniform certification standards or compliance requirements for Consent Managers across sectors. In finance, Account Aggregators are licensed and monitored by the Reserve Bank of India, but in sectors like health and telecom, governance varies widely or is underdeveloped. The DPDP Act, though it vests enforcement powers in the Data Protection Board of India (DPBI), does not outline any framework for certifying or auditing Consent Managers, nor does it require public reporting of compliance failures or security incidents[xiii]. This regulatory lacuna risks turning Consent Managers into invisible bureaucracies with significant power but insufficient accountability. A robust certification scheme, akin to the UK’s Age-Appropriate Design Code or the EU’s eIDAS trust framework, would be essential to ensure that these actors uphold the values of transparency, data minimization, and lawful processing. Without such mechanisms, the promise of Consent Managers ensuring legally valid, meaningful, and autonomous consent remains only partially fulfilled[xiv].
Evaluating the Sufficiency of Consent via Consent Managers
- A. User Interfaces and the Illusion of Choice
- Interface Design and Default Opt-Ins
While the DPDP Act emphasizes consent as a cornerstone of data processing, the practical experience of end-users often reveals otherwise. Digital consent interfaces are frequently engineered to nudge users toward acceptance, using default opt-ins, pre-checked boxes, or design placements that make refusal cumbersome. These interface tactics distort user autonomy and call into question the ‘free’ nature of consent as required under Section 6 of the Act[xv]. A 2023 study by the Internet Freedom Foundation demonstrated that many popular Indian apps deploy design dark patterns that subtly coerce users into giving consent, often without reading or understanding the implications[xvi]. Such interface practices are not merely design flaws; they operate at the intersection of law and behavioural economics. As seen in global jurisdictions like the European Union, regulators have raised concerns that consent obtained via manipulative UI fails the standard of “freely given” under Article 7 of the GDPR. India, despite modelling certain aspects of its framework on such global norms, has yet to integrate robust design standards into its consent enforcement regime[xvii]. Thus, while Consent Managers are envisaged as facilitators, they may inadvertently become conduits of these coercive practices if not actively regulated.
- Behavioral Economics: Dark Patterns and Fatigue
Consent fatigue, where users repeatedly face consent prompts and thus default to agreeing, significantly undermines the quality of consent. This is exacerbated by “dark patterns,” a term used to describe deceptive user experience practices that trick individuals into actions they might not intend to perform[xviii]. India’s DPI ecosystem, under which Consent Managers operate, currently lacks enforceable standards to counteract these subtle manipulations. The cumulative impact is that Consent Managers may technically fulfill procedural requirements without ensuring substantive legal compliance with “informed and unambiguous” consent. Behavioral studies reveal that overexposure to privacy prompts not only reduces engagement but also diminishes the likelihood of users reading or understanding privacy terms[xix]. Unless Consent Managers are bound by prescriptive interface norms that counteract these trends, they may fail to achieve the very objective they were introduced for: ensuring meaningful user choice.
- B. Regulatory Challenges in Ensuring Informed Consent
- Lack of Enforcement Guidelines and Penalties
Despite its progressive intent, the DPDP Act, 2023 suffers from notable regulatory ambiguity when it comes to the operationalization of informed consent. While Section 6 defines the essential attributes—free, informed, specific, and unambiguous—the Act fails to provide clear, enforceable benchmarks for what constitutes a violation of these standards in practice[xx]. There is no codified framework under which either the Data Protection Board of India (DPBI) or designated oversight bodies can evaluate whether Consent Managers are meeting their obligations in spirit and substance. This regulatory vacuum undermines accountability. For instance, although Section 27 of the Act vests the Board with powers to impose financial penalties, these are reactive and dependent on complaints or audits, neither of which is frequent or well defined[xxi]. In comparison, the European GDPR regime empowers national authorities to issue guidelines, conduct proactive inspections, and require organizations to demonstrate compliance with design-centric consent standards. India currently lacks such pre-emptive enforcement tools, placing an undue burden on the end-user to detect violations and initiate redressal. Moreover, without sector-specific implementation rules or regulatory sandboxes, many Consent Managers operate with broad discretion, creating a fragmented compliance landscape. This inconsistency is particularly problematic when applied across sectors such as health tech, fintech, and e-commerce, where data sensitivity and literacy levels differ drastically. The absence of proactive compliance audits or mandated UI/UX consent audits makes the current regime vulnerable to superficial compliance[xxii].
- Fragmented Literacy and Digital Divide Barriers
A major impediment to meaningful consent in India lies in digital and linguistic accessibility. India’s digital population may be large, but it is not homogenous—millions access the internet through low-cost smartphones with limited data, and in regional languages often poorly supported by platforms or Consent Manager tools. In such circumstances, expecting users to navigate consent forms crafted in complex legal or technical jargon becomes unrealistic[xxiii]. The DPDP Act recognizes the importance of accessibility in principle but stops short of mandating specific measures such as multi-lingual consent notices, text-to-speech interfaces, or simplified visual explanations for illiterate users. The DEPA framework—which underlies the operation of Consent Managers—was designed with inclusion in mind, but its practical implementation often fails to address the literacy gap or accommodate those with limited digital fluency[xxiv]. Without standardized, inclusive consent architectures, Consent Managers may inadvertently exclude large portions of the population or create a consent mirage, where opt-ins occur without true understanding. Comparative jurisdictions have begun addressing this issue more concretely. For example, Brazil’s LGPD mandates “clear, adequate, and accessible” information regarding data processing, while South Africa’s POPIA encourages consent mechanisms adapted for the user’s comprehension level[xxv]. For India to uphold the constitutional right to informational self-determination, it must ensure that the means of giving consent are not merely legally sufficient but also practically meaningful for all citizens.
Comparative Legal Insights
- A. European Union (GDPR)
- Explicit Consent and Standard of “Informed”
The General Data Protection Regulation (GDPR) provides one of the most comprehensive legal frameworks for data protection globally, with consent as a foundational legal basis for processing personal data under Article 6(1)(a)[xxvi]. Unlike India’s DPDP Act, the GDPR clearly differentiates between ordinary and explicit consent, requiring explicit consent for sensitive categories of data under Article 9, and in all cases requiring that consent be “freely given, specific, informed, and unambiguous.” This standard is interpreted strictly, with recitals 32 and 42–43 emphasizing that silence, pre-ticked boxes, or inactivity do not constitute valid consent[xxvii]. Moreover, the GDPR mandates that consent must be demonstrable by the controller, thus placing a burden of proof not only on ensuring the consent was obtained, but that it was done in a manner that satisfies the elements of informed choice. This has led to jurisprudential and administrative enforcement, notably by authorities such as the CNIL (France) and the Irish DPC, who have invalidated consent mechanisms where user interfaces were overly complex, or where options were bundled in ways that negated specificity[xxviii]. India’s DPDP Act superficially mirrors GDPR’s consent framework but lacks equivalent interpretive depth and enforcement practice. While both emphasize “informed” consent, the GDPR enforces this standard through active supervisory authorities and detailed regulatory guidelines, whereas the Indian model still awaits substantive interpretation and rule-making from the Data Protection Board of India.
- EDPB Guidelines on Layered Consent Interfaces
A key strength of the EU’s consent architecture lies in the guidance issued by the European Data Protection Board (EDPB), which outlines best practices for designing consent flows that are accessible and legally compliant. In its 2020 guidelines on consent under Regulation 2016/679, the EDPB emphasized layered interfaces, where information is presented progressively to prevent overload while maintaining transparency[xxix]. These interfaces are particularly useful in mobile environments—comparable to India’s Aadhaar-linked, smartphone-driven data ecosystem. The EDPB has clarified that such mechanisms can uphold the “informed” requirement only if each layer offers clear navigability, unambiguous explanations, and real-time access to data processing purposes[xxx]. Any attempt to bury essential details under legalese or submenus is treated as undermining consent validity. Indian Consent Managers currently lack binding interface design norms akin to the EDPB guidelines. The DEPA framework has proposed principles such as “consent receipts” and auditability, but it stops short of enforcing presentation-layer standards or requiring usability testing for consent screens. Without analogous technical documentation or regulatory interpretation, Indian fiduciaries have wide latitude in UI design, risking both user confusion and regulatory evasion. In sum, the GDPR not only defines informed consent robustly but also operationalizes it through continuous interpretive guidance and consistent enforcement, factors largely absent in India’s present regime. This comparative insight highlights that legal recognition alone is insufficient; clarity must also emerge through implementation standards and public enforcement.
- B. California’s CPRA
- Granular Control and Opt-Out Defaults
The California Privacy Rights Act (CPRA), which amends and enhances the California Consumer Privacy Act (CCPA), emphasizes granular control by giving consumers distinct, easily exercisable rights over their personal data. While the CCPA introduced the right to opt out of the sale of personal data, the CPRA extends this by including sharing for cross-contextual behavioral advertising and requiring clear opt-out mechanisms—notably via a “Do Not Sell or Share My Personal Information” link[xxxi]. Unlike the DPDP Act, which centers on consent as an entry point, the CPRA builds a broader framework of data minimization, purpose limitation, and user empowerment, thereby reducing over-reliance on a single moment of consent. Importantly, the CPRA mandates granular opt-in for minors under 16 and requires affirmative authorization for selling or sharing their personal data, with a stricter threshold for those under 13, where parental consent is necessary[xxxii]. This granular model contrasts with India’s current approach, where consent often operates as a binary gateway without layered controls. Even though the DEPA ecosystem envisions user-centric control through Consent Managers, there is limited regulatory compulsion for opt-out design mandates, which are central to CPRA compliance.
- Consent Mechanisms for Minors and Sensitive Data
The CPRA introduces heightened obligations for the processing of sensitive personal information, a category that includes government identifiers, biometric data, geolocation, race, religion, and sexual orientation. Businesses must provide a clear notice and the right to limit the use and disclosure of such information via a “Limit the Use of My Sensitive Personal Information” link[xxxiii]. This aligns with GDPR’s emphasis on necessity and proportionality but goes further in embedding such safeguards in the user interface itself. Additionally, the California Privacy Protection Agency (CPPA)—an independent body created by the CPRA—has regulatory authority to prescribe technical specifications, such as universal opt-out mechanisms (like browser signals) that can automate consent withdrawal across platforms[xxxiv]. These tools operationalize informed choice in practice, reducing the cognitive burden on users and preempting interface manipulation. India’s DPDP Act, while acknowledging the need for verifiability and user autonomy, lacks such regulatory tooling. For example, it contains no requirement for standardized withdrawal mechanisms or real-time revocation of consent, nor does it recognize sensitive data classes with interface-level rights like CPRA does. As a result, Consent Managers under DPDP currently function more like passive brokers of consent rather than active guardians of user intent. Ultimately, the CPRA’s layered and enforceable consent infrastructure—especially for vulnerable users and high-risk data—provides a valuable blueprint. India could benefit from incorporating user-centric defaults, interface-based controls, and sector-specific obligations, all of which enhance both the legal sufficiency and practical enforceability of consent.
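The universal opt-out signals mentioned above can be illustrated with a small, framework-agnostic sketch. It assumes the Global Privacy Control convention of a "Sec-GPC: 1" request header; the handling logic is a simplification for illustration, not the CPPA's prescribed implementation.

```python
# Minimal sketch of honouring a universal opt-out signal such as the
# Global Privacy Control (GPC). The "Sec-GPC" header name follows the
# public GPC proposal; the surrounding logic is an illustrative
# assumption, not a compliance recipe prescribed by the CPPA.
def treat_as_opted_out(request_headers: dict[str, str],
                       stored_preference: bool | None) -> bool:
    """Return True if the user should be treated as opted out of sale/sharing."""
    # A browser-level signal is honoured like a manual "Do Not Sell or
    # Share My Personal Information" election.
    gpc_signal = request_headers.get("Sec-GPC", "").strip() == "1"
    # An explicit preference already recorded for this user also counts.
    explicit_opt_out = stored_preference is True
    return gpc_signal or explicit_opt_out


# A request carrying the signal is treated as an opt-out even if the
# user never clicked the opt-out link.
print(treat_as_opted_out({"Sec-GPC": "1"}, stored_preference=None))  # True
```

Such signal handling automates the user's choice across platforms, which is exactly the reduction in cognitive burden the CPRA regulations aim at.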
Reform Proposals and the Way Forward
- A. Regulatory Clarity and Design Guidelines
- Consent UI Audits and Standard-Setting by DPA
To bridge the gap between legal validity and practical enforceability, the most urgent need is for the Data Protection Board of India (DPBI) to issue detailed guidance on consent interface design, much like the European Data Protection Board’s interface guidelines under the GDPR. Presently, neither the DPDP Act nor its accompanying rules provide objective parameters for evaluating user interface elements that facilitate consent. An effective regulatory response would include mandatory consent UI audits: periodic evaluations of how consent is collected, including scrutiny for dark patterns, deceptive defaults, or excessive granularity that overwhelms users. These audits should be part of Data Fiduciary accountability obligations, especially for Significant Data Fiduciaries under Section 10 of the DPDP Act[xxxv]. By institutionalizing regular assessments of interface design, the DPBI can convert abstract rights into functional safeguards, compelling platforms and Consent Managers to demonstrate usability compliance, not just legal checkboxes. Regulatory standard-setting can also borrow from the Guidelines for Data Protection Impact Assessments (DPIA) under the GDPR, requiring risk assessment of consent collection tools. The DPBI could frame a Code of Practice on Consent, specifying design principles (such as color contrast, font size, iconography, and button placement) to prevent interface manipulation and ensure real comprehension[xxxvi].
- Plain Language Disclosure Mandates
A major flaw in current consent practices is the persistent use of legalistic or technical language, which undermines comprehension. The DPDP Act mentions “informed” consent but does not explicitly mandate linguistic or cognitive accessibility, especially for users from marginalized or semi-literate populations. As a remedy, the DPBI should introduce plain language disclosure standards, inspired by instruments such as the UK’s Plain Language Commission or the US SEC’s Plain English Handbook. Such standards must require that all consent notices be provided in clear, concise, and contextually adapted formats, possibly in local languages, and verified through user-testing protocols. This is particularly vital in India’s multilingual environment, where misunderstanding due to translation or jargon is not just common but expected. Disclosure readability tests, such as the Flesch-Kincaid score or an Indian-language equivalent, could serve as compliance benchmarks, as illustrated in the sketch below[xxxvii]. Furthermore, consent notices should distinguish between core and optional data uses, using layered interfaces that allow the user to understand consequences of their choices. This echoes GDPR’s layered consent approach and CPRA’s separate notice requirements for sensitive data. Consent Managers must thus move from being mere gateways to becoming facilitators of intelligibility, backed by design and language norms enforced through regulation.
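As a rough illustration of how such a readability benchmark could be computed for an English-language consent notice, the sketch below implements the standard Flesch-Kincaid grade-level formula with a crude syllable heuristic. The threshold a regulator might set, and any Indian-language equivalent, are open questions; the code only shows that the benchmark is mechanically testable.

```python
import re

def count_syllables(word: str) -> int:
    """Crude English syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59.
    Valid for English only; lower values indicate simpler language.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

# A regulator could require consent notices to stay below a chosen
# grade level (the cut-off used here is a hypothetical example).
notice = "We share your data with partners to improve our services. You may withdraw consent at any time."
print(flesch_kincaid_grade(notice) <= 8.0)
```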
- B. Technological Safeguards and Independent Audits
- Interoperable Trust Frameworks
While Consent Managers serve as intermediaries between users and Data Fiduciaries, their effectiveness hinges on integration within a technologically robust and interoperable trust ecosystem. The current DEPA (Data Empowerment and Protection Architecture) model endorsed by NITI Aayog provides a foundational vision, but without enforceable standards, it risks remaining aspirational[xxxviii]. To enhance interoperability and public trust, a consent verification ledger, either blockchain-based or maintained via certified APIs, could ensure that all data access requests and consents are auditable and immutable. This system would empower users with transaction logs detailing when and how their data was accessed, and by whom. Models such as IndiaStack and the Account Aggregator Framework demonstrate this potential, but lack uniform safeguards across sectors[xxxix]. Moreover, interoperability standards must be not only technical but also regulatory, requiring that all certified Consent Managers use common ontologies, taxonomies, and consent artefacts. The DPBI could adopt protocols like MyData from Finland, or the Solid framework from the EU’s NGI initiative, to mandate machine-readable, revocable, and context-aware consent objects[xl]. This would guard against vendor lock-in and ensure that users retain control over their data even when switching platforms.
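The consent verification ledger described above can be sketched, under simple assumptions, as a hash-chained append-only log in which each record carries the hash of the record before it, so later tampering is detectable. A real deployment would rest on certified APIs or a distributed ledger with stronger guarantees; the code only illustrates the auditability property.

```python
import hashlib
import json
from datetime import datetime, timezone

# Minimal sketch of a tamper-evident consent ledger. Each entry embeds
# the hash of the previous entry, so altering any past record breaks
# the chain. Illustrative only; not a production or DEPA-mandated design.
class ConsentLedger:
    def __init__(self) -> None:
        self.entries: list[dict] = []

    def record(self, principal: str, fiduciary: str, action: str, purpose: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "principal": principal,
            "fiduciary": fiduciary,
            "action": action,          # e.g. "grant", "access", "revoke"
            "purpose": purpose,
            "prev_hash": prev_hash,
        }
        entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash to confirm no earlier record was altered."""
        prev_hash = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev_hash:
                return False
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
                return False
            prev_hash = entry["hash"]
        return True

# Usage: the ledger gives the user an auditable trail of who accessed
# what data, and for which purpose.
ledger = ConsentLedger()
ledger.record("user-123", "bank-A", "grant", "loan underwriting")
ledger.record("user-123", "bank-A", "access", "loan underwriting")
print(ledger.verify())  # True unless an entry has been tampered with
```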
- Periodic Validation of Consent Flows
The legitimacy of a consent-based regime cannot rest on a one-time user action. Instead, platforms must implement periodic consent validation protocols, especially where data use evolves significantly from the original purpose. This principle, grounded in the GDPR’s emphasis on “ongoing validity” of consent, demands that Data Fiduciaries re-seek consent or provide automated re-confirmation interfaces under specific triggers (e.g., change in privacy policy, new data-sharing partners)[xli]. The DPBI could operationalize this through third-party audits, akin to financial audits, conducted by certified consent flow evaluators. These auditors would assess whether the Consent Managers and Data Fiduciaries provide users with contextually relevant nudges, reminders of data retention periods, and opportunities to withdraw consent without degradation of service quality. Such audits should be submitted annually and made available for public scrutiny, following the model of the Transparency Reports mandated under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021[xlii]. Finally, periodic audits must include usability testing, especially in regional languages and with vulnerable users (elderly, children, and semi-literate populations). Consent that cannot be understood is not legally valid. Hence, the future of Consent Managers lies not in legal formalism, but in verifiable trust, transparent architecture, and universal design compliance.
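A minimal sketch of what trigger-based re-validation could look like in code is given below. The trigger list and the twelve-month validity window are assumptions made for illustration; they are not thresholds prescribed by the DPDP Act, its rules, or the GDPR.

```python
from datetime import datetime, timedelta

# Hypothetical re-consent triggers; the 365-day window and the trigger
# names are illustrative assumptions, not statutory requirements.
REVALIDATION_WINDOW = timedelta(days=365)

def needs_reconsent(granted_at: datetime,
                    policy_changed_since_grant: bool,
                    new_sharing_partner_added: bool,
                    purpose_expanded: bool,
                    now: datetime | None = None) -> bool:
    """Return True if the fiduciary should re-seek or re-confirm consent."""
    now = now or datetime.utcnow()
    stale = (now - granted_at) > REVALIDATION_WINDOW
    return stale or policy_changed_since_grant or new_sharing_partner_added or purpose_expanded

# Example: a privacy-policy change triggers re-confirmation even though
# the original consent is only a few months old.
print(needs_reconsent(datetime.utcnow() - timedelta(days=90),
                      policy_changed_since_grant=True,
                      new_sharing_partner_added=False,
                      purpose_expanded=False))  # True
```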
Conclusion
The DPDP Act, 2023 envisions a consent-centric data protection regime, placing informed, free, and specific consent at the heart of lawful data processing. In theory, Consent Managers offer a systemic innovation, serving as fiduciary-neutral facilitators of user autonomy within India’s evolving digital infrastructure. However, as this paper has shown, there is a considerable gap between the formal legal requirements of consent and the realities of its implementation.
Despite legislative clarity under Section 6 of the DPDP Act[xliii], the reliance on digital interfaces and behavioral defaults undermines the true voluntariness and comprehension of user consent. Consent Managers, while a novel solution, are currently framed more as data intermediaries than as stewards of informed decision-making. Their role, unless backed by enforceable duties of care, plain-language standards, and accessibility mandates, risks reducing consent to a checkbox exercise devoid of actual legal significance[xliv]. Further, regulatory fragmentation, inadequate penalties, and infrastructural inequalities—especially digital illiteracy and interface fatigue—threaten to create a compliance façade without substantive protection. International parallels like the GDPR and CPRA reinforce the need for layered, granular, and contextual consent mechanisms, backed by real-time user agency and independent oversight[xlv]. Moving forward, the reform agenda must rest on three pillars: (i) regulatory precision, especially in defining informed consent thresholds; (ii) technological safeguards, including interoperable frameworks and periodic validation; and (iii) user-centered design, supported by plain language policies and robust audits. In this context, Consent Managers must transition from passive facilitators to active guardians of legality, tasked with verifying not just that consent was collected—but that it was collected lawfully, clearly, and meaningfully[xlvi]. Ultimately, India’s data protection ecosystem will only be as strong as its weakest interface. If Consent Managers are to serve as digital fiduciaries, they must earn public trust not through opaque certifications, but by demonstrating consistent, user-verified adherence to the spirit of informed consent[xlvii].
In the final analysis, while the DPDP Act, 2023 aspires to uphold individual autonomy through consent-centric governance, the operational reality of digital ecosystems demands more than statutory compliance; it requires systemic empathy and design accountability. Consent Managers must not become silent gatekeepers of procedural formalities but must instead evolve into transparent, legally accountable guardians of meaningful user choice. Until then, the Indian data protection regime risks perpetuating a false sense of control, where “consent” exists in law but not in lived digital experience[xlviii].
[i] Digital Personal Data Protection Act, 2023, § 6 (India).
[ii] NITI Aayog, Designing the DEPA Architecture: Consent-based Data Sharing in India, available at https://www.niti.gov.in/sites/default/files/2020-09/DEPA-Book_0.pdf (last visited May 28, 2025).
[iii] Reserve Bank of India, Master Direction - Non-Banking Financial Company - Account Aggregator (Reserve Bank) Directions, 2016, Notification No. DNBR.PD.009/03.10.119/2016-17 (Sept. 2, 2016), available at https://rbi.org.in/scripts/BS_ViewMasDirections.aspx?id=10598 (last visited May 28, 2025).
[iv] Digital Personal Data Protection Act, No. 22 of 2023, § 6(1), Acts of Parliament, 2023 (India).
[v] Rahul Matthan, Exploring India’s Data Protection Law, Interpreting India Podcast, Carnegie Endowment for International Peace (Sept. 7, 2023), https://carnegieendowment.org/podcasts/interpreting-india/exploring-indias-data-protection-law-with-rahul-matthan?lang=en.
[vi] DPDP Act, supra note 4, § 6(3).
[vii] Mozilla Foundation, “State of Online Privacy Reaches ‘Very Creepy’ Level, Finds Mozilla’s First-Annual Consumer Creep-O-Meter,” Mozilla Foundation (2023), https://foundation.mozilla.org/en/blog/state-of-online-privacy-reaches-very-creepy-level-finds-mozillas-first-annual-consumer-creep-o-meter/.
[viii] European Data Protection Board (EDPB), “Guidelines 05/2020 on Consent under Regulation 2016/679,” Version 1.1 (May 2020), https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf.
[ix] NITI Aayog, Data Empowerment and Protection Architecture (DEPA): Empowering Citizens with Control Over Their Data, 2020, https://www.niti.gov.in/sites/default/files/2020-09/DEPA-Book_0.pdf.
[x] NITI Aayog, “Data Empowerment and Protection Architecture (DEPA): Empowering Citizens with Control Over Their Data,” NITI Aayog (2020), https://www.niti.gov.in/sites/default/files/2020-09/DEPA-Book_0.pdf.
[xi] Ministry of Electronics and Information Technology (MeitY), “Year-End Review 2024: Transforming India’s Digital Landscape Through Strategic Initiatives,” DD News (2024), https://ddnews.gov.in/en/year-end-review-2024-transforming-indias-digital-landscape-through-strategic-initiatives/.
[xii] European Data Protection Board (EDPB), Guidelines 05/2020 on Consent under Regulation 2016/679, Version 1.1 (May 2020), https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf
[xiii] DPDP Act, supra note 4, § 27–28 (establishing the Data Protection Board without sector-specific mandates).
[xiv] World Economic Forum, “Why India's DPI is Now Attracting Worldwide Interest,” World Economic Forum (2023), https://www.weforum.org/agenda/2023/08/the-international-significance-of-indias-digital-public-infrastructure/.
[xv] The Digital Personal Data Protection Act, No. 22 of 2023, § 6, Acts of Parliament, 2023 (India).
[xvi] Internet Freedom Foundation, Comments on the Draft Guidelines on Prevention and Regulation of Dark Patterns, Internet Freedom Foundation (Oct. 5, 2023), https://content.internetfreedom.in/api/files/divco3ywedt9rpe/vjgsta3321356wn/iff_2023_042_iff_s_comments_on_draft_guidelines_on_dark_patterns_05_10_2023_RC6CGI8oDR.pdf.
[xvii] European Data Protection Board, Guidelines 05/2020 on Consent under Regulation 2016/679, Version 1.1 (May 2020), available at https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf.
[xviii] Arunesh Mathur et al., Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites, 3 Proc. ACM Hum.-Comput. Interact. 1 (2019), available at https://doi.org/10.1145/3359183.
[xix] Daniel J. Solove, Privacy Self-Management and the Consent Dilemma, 126 Harv. L. Rev. 1880, 1900–1905 (2013).
[xx] The Digital Personal Data Protection Act, No. 22 of 2023, § 6, Acts of Parliament, 2023 (India).
[xxi] Id. at § 27; see also Anirudh Burman, Understanding India's New Data Protection Law, Carnegie Endowment for International Peace (Oct. 3, 2023), https://carnegieendowment.org/research/2023/10/understanding-indias-new-data-protection-law.
[xxii] NITI Aayog, Data Empowerment and Protection Architecture: A Secure Consent-Based Data Sharing Framework, Discussion Paper (Aug. 2020), https://www.niti.gov.in/sites/default/files/2023-03/Data-Empowerment-and-Protection-Architecture-A-Secure-Consent-Based.pdf.
[xxiii] Jemis Mali & Falak Mehta, Bridging the Digital Divide in India: Barriers to Adoption and Usage, Indian Council for Research on International Economic Relations (ICRIER) (2023), https://icrier.org/pdf/Bridging_the_Digital_Divide_in_India.pdf.
[xxiv] NITI Aayog, Data Empowerment and Protection Architecture: A Secure Consent-Based Data Sharing Framework, Discussion Paper (Aug. 2020), https://www.niti.gov.in/sites/default/files/2023-03/Data-Empowerment-and-Protection-Architecture-A-Secure-Consent-Based.pdf.
[xxv] Lei Geral de Proteção de Dados Pessoais (LGPD), Law No. 13.709/2018, art. 6, de 14 de agosto de 2018, D.O.U. de 15.08.2018 (Braz.); Protection of Personal Information Act 4 of 2013 § 5 (S. Afr.).
[xxvi] Regulation (EU) 2016/679 of the European Parliament and of the Council, art. 6(1)(a), 2016 O.J. (L 119) 1 [hereinafter GDPR].
[xxvii] GDPR, rec. 32, 42–43.
[xxviii] Commission Nationale de l’Informatique et des Libertés (CNIL), Délibération SAN-2019-001 du 21 janvier 2019 (Jan. 21, 2019), https://www.cnil.fr/sites/cnil/files/atoms/files/san-2019-001.pdf; Data Protection Commission (Ireland), Inquiry into Meta Platforms Ireland Limited (Facebook Service), Decision No. IN-18-5-5 (Dec. 31, 2022), https://www.dataprotection.ie/sites/default/files/uploads/2024-12/Meta-Final-Decision-IN-19-4-1-Redacted.pdf.
[xxix] European Data Protection Board, Guidelines 05/2020 on Consent Under Regulation 2016/679, v.1.1 (May 4, 2020), https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202005_consent_en.pdf.
[xxx] Id. at 9–10.
[xxxi] Cal. Civ. Code § 1798.135(a)(1) (West 2023); see also California Privacy Protection Agency, Final Regulations Text Implementing the California Privacy Rights Act (Mar. 29, 2023), https://cppa.ca.gov/regulations/pdf/20230329_final_regs_text.pdf.
[xxxii] Cal. Civ. Code § 1798.120(c) (West 2023).
[xxxiii] Cal. Civ. Code § 1798.121(a)–(c) (West 2023).
[xxxiv] California Privacy Protection Agency, Final Statement of Reasons for Regulations Implementing the CPRA, Cal. Code Regs. tit. 11, §§ 7000–7304, at 22–26 (Mar. 29, 2023), https://cppa.ca.gov/meetings/materials/20241004_item4_draft_final_statement_of_reasons.
[xxxv] Digital Personal Data Protection Act, No. 22 of 2023, § 10 (India).
[xxxvi] European Data Protection Board, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, Version 2.0 (Oct. 20, 2020), https://edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-42019-article-25-data-protection-design-and_en.
[xxxvii] U.S. Securities and Exchange Commission, A Plain English Handbook: How to Create Clear SEC Disclosure Documents (1998), https://www.sec.gov/pdf/handbook.pdf.
[xxxviii] NITI Aayog, Data Empowerment and Protection Architecture (DEPA): Empowering People with Control Over Their Data, Discussion Paper (Aug. 2020), https://www.niti.gov.in/sites/default/files/2023-03/Data-Empowerment-and-Protection-Architecture-A-Secure-Consent-Based.pdf.
[xxxix] Reserve Bank of India, Master Direction – Non-Banking Financial Company – Account Aggregator (Reserve Bank) Directions, 2016, https://rbi.org.in/scripts/BS_ViewMasDirections.aspx?id=10598.
[xl] MyData Global, The MyData Declaration (2020), https://mydata.org/declaration; Solid Project, Inrupt – Solid Specifications, https://solidproject.org/TR/.
[xli] European Data Protection Board, Guidelines 05/2020 on Consent under Regulation 2016/679, Version 1.1 (May 4, 2020), https://edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-052020-consent-under-regulation-2016679_en.
[xlii] Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, G.S.R. 139(E), § 4(1)(d) (India).
[xliii] The Digital Personal Data Protection Act, No. 22 of 2023, § 6 (India).
[xliv] Apar Gupta, Articles on Data Governance, Internet Freedom Foundation, https://internetfreedom.in/author/apar/ (last visited May 29, 2025).
[xlv] European Union Agency for Fundamental Rights, Your Rights Matter: Data Protection and Privacy – Fundamental Rights Survey 2021 (2021), https://fra.europa.eu/en/publication/2021/fundamental-rights-report-2021.
[xlvi] Daniel J. Solove & Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 Colum. L. Rev. 583, 587–601 (2014).
[xlvii] Rahul Matthan, Exploring India’s Data Protection Law, Interpreting India Podcast, Carnegie Endowment for International Peace (Sept. 7, 2023), https://carnegieendowment.org/podcasts/interpreting-india/exploring-indias-data-protection-law-with-rahul-matthan?lang=en.
[xlviii] Jef Ausloos & Ingrida Milkaite, Children’s Data Protection in the EU: From Consent to Fairness, 19(3) Int'l J. Law & Info. Tech. 180, 182 (2020).