A Comprehensive Legal Analysis for Judicial Officers, Advocates, and Legal Scholars
I. INTRODUCTION: THE SHIFT FROM PROTECTION TO OPERATIONAL VERIFICATION
The Digital Personal Data Protection Act, 2023 established India’s foundational commitment to protecting children’s personal data. However, the promulgation of the Digital Personal Data Protection Rules, 2025 (effective November 13, 2025) marks a paradigmatic shift from declarative principles to operationalised, technology-enabled verification mechanisms. This transformation moves beyond philosophical commitments to create actionable, enforceable protocols that demand technical precision and procedural rigour from data fiduciaries.
The question that now demands urgent legal examination is not merely whether children’s data should be protected—but how to verify that protection through credible, scalable, and legally defensible mechanisms.
Rule 10 of the DPDP Rules, 2025 answers this question through a multi-layered verification architecture that distinguishes India’s approach from global benchmarks while introducing unprecedented operational complexity for compliance officers and judges alike.
II. THE STATUTORY ARCHITECTURE: RULE 10 AND SECTION 9 OF THE DPDP ACT
A. The Foundational Mandate
Section 9 of the DPDP Act, 2023 establishes three non-negotiable principles:
1. Verifiable Parental Consent Requirement: Before processing any personal data of a child (defined as any person under 18 years), a Data Fiduciary must obtain verifiable consent from the parent or lawful guardian.
2. Substantive Prohibitions: Data Fiduciaries are categorically prohibited from:
– Tracking or monitoring a child’s behaviour
– Conducting targeted advertising directed at children
– Processing personal data in a manner likely to cause harm to the child
3. Proportionality Doctrine: Processing must be limited to the extent necessary to serve the specified purpose and must adhere to data minimisation principles.
B. Rule 10: The Operational Implementation
Rule 10 translates these statutory mandates into a technical and organisational framework. The language, “appropriate technical and organisational measures”, echoes Article 32(1) of the GDPR, signalling India’s intent to harmonise its standards with international jurisprudence while maintaining contextual adaptation.
The critical distinction Rule 10 introduces is this: parental consent is not merely requested; it must be verifiable, traceable, and authenticated through documented identity and age credentials.
III. THE VERIFICATION ARCHITECTURE: MULTI-PATHWAY APPROACH
A. The Twin-Pillar Verification Model
Rule 10 establishes two primary routes for verifying parental identity and adulthood:
Pillar 1: Reliable Details Already Held by the Data Fiduciary
When a parent is already a registered user of the data fiduciary’s platform and has previously provided verified identity and age details, the fiduciary may rely on these reliable details held in its database.
Judicial Scrutiny Required: The term “reliable” is not defined with mathematical precision. Courts will likely interpret this through the lens of:
– Whether the details were obtained through government-authorised verification mechanisms (Aadhaar, PAN, DL, Passport)
– The temporal proximity of the stored details (how recent is the verification?)
– Whether the details are cross-checked against authoritative government databases
Risk Allocation: A fiduciary that relies on outdated or unverified stored credentials faces potential breach liability. If a person falsely claimed to be the child’s parent and was registered as such, the fiduciary’s reliance on those stored details may not shield it from liability.
Pillar 2: Voluntarily Provided Details or Virtual Tokens
For parents who are not existing users, Rule 10 permits verification through:
1. Voluntarily Provided Identity and Age Details: Parents may submit identity documents directly (e.g., Aadhaar card, passport scans). This creates a new data flow requiring stringent security protocols, as the fiduciary now temporarily holds sensitive parental identity data for verification purposes only.
2. Virtual Tokens Issued by Authorised Entities: The rules recognise virtual tokens, a cryptographic or hashed representation of identity and age data, as a legitimate verification method (a minimal sketch appears after this list). These tokens are issued by:
– Entities entrusted by law with maintaining identity and age records (e.g., Indian government agencies)
– Entities appointed by such authorised bodies
– Digital Locker Service Providers notified under the Information Technology Act, 2000
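By way of illustration only, the short Python sketch below shows what such a token could look like: a signed assertion of adulthood that carries no underlying identity details. The field names, issuer identifier, and HMAC-based signing are assumptions made for this sketch, not the format used by any authorised entity; a real issuer would ordinarily rely on asymmetric signatures so that verifiers never hold the signing key.

```python
# Purely hypothetical sketch of a "virtual token": a signed assertion of adulthood that
# carries no underlying identity details. Field names, the issuer identifier, and the
# HMAC-based signing are illustrative assumptions, not a prescribed DigiLocker/UIDAI format.
import hashlib
import hmac
import json
import time

ISSUER_SECRET = b"demo-signing-key-held-by-authorised-entity"  # placeholder key

def issue_virtual_token(is_adult: bool) -> dict:
    """Issued by an authorised entity: asserts adulthood without exposing identity data."""
    payload = {
        "issuer": "authorised-entity.example",  # hypothetical issuer identifier
        "adult": is_adult,                      # the only substantive claim shared
        "issued_at": int(time.time()),
    }
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(ISSUER_SECRET, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify_virtual_token(token: dict) -> bool:
    """Checked by the Data Fiduciary: confirms integrity and the adulthood claim."""
    body = json.dumps(token["payload"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["signature"]) and bool(token["payload"]["adult"])

print(verify_virtual_token(issue_virtual_token(is_adult=True)))  # True
```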
B. The Digital Locker Innovation: India’s Unique Mechanism
The explicit inclusion of Digital Locker as an authorised verification mechanism is a defining feature of India’s regulatory approach. Digital Locker, administered under the Ministry of Electronics and Information Technology, enables citizens to:
– Store government-issued digital documents in a secure, accessible vault
– Generate verifiable tokens linked to Aadhaar
– Share tokens with third parties without disclosing underlying identity details
Regulatory Significance: By leveraging Digital Locker, parents can authenticate themselves without having to download and submit copies of physical documents, thereby reducing friction while maintaining authentication rigour. This reflects a privacy-by-design approach: the fiduciary never sees the actual identity details; it receives only a government-verified token.
Practical Implication: A parent creates a child account on a social media platform. The platform prompts her to authenticate via Digital Locker. She logs into her Digital Locker account (which requires Aadhaar authentication), selects age and identity verification tokens, and shares them with the platform. The platform receives a cryptographic confirmation that an adult, verified by the government, has consented. The parent’s identity details remain encrypted in the government’s vault.
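That sequence can be sketched, under stated assumptions, as a short orchestration routine. The `LockerClient` class, its method, and the returned assertion shape are invented stand-ins for whatever interface a notified Digital Locker Service Provider actually exposes; the point is simply that the fiduciary consumes a yes/no adulthood confirmation and an opaque reference, never the raw identity documents.

```python
# Hypothetical orchestration of the Digital Locker flow described above. LockerClient,
# AgeAssertion, and their fields are invented stand-ins, not a real DigiLocker API.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AgeAssertion:
    verified_adult: bool   # the only claim the fiduciary receives
    issuer: str            # e.g. a notified Digital Locker Service Provider
    reference_id: str      # opaque reference retained for audit purposes

class LockerClient:
    """Placeholder for the provider's interface; a real client would call the provider's service."""
    def request_age_assertion(self, parent_session_token: str) -> AgeAssertion:
        # Simulated response: the provider authenticates the parent (e.g. Aadhaar-backed login)
        # and returns a signed adulthood assertion rather than raw identity documents.
        return AgeAssertion(verified_adult=True, issuer="locker-provider.example",
                            reference_id="demo-ref-001")

def obtain_parental_consent(locker: LockerClient, parent_session_token: str,
                            purposes: list[str]) -> dict | None:
    """Record consent only if a government-verified adulthood assertion is returned."""
    assertion = locker.request_age_assertion(parent_session_token)
    if not assertion.verified_adult:
        return None  # no verifiable parental consent: do not process the child's data
    return {
        "consented_at": datetime.now(timezone.utc).isoformat(),
        "verification_method": "digital_locker_token",
        "assertion_reference": assertion.reference_id,  # no raw identity details are stored
        "purposes": purposes,
    }

record = obtain_parental_consent(LockerClient(), "session-abc", ["child_account_creation"])
```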
IV. THE FOUR-CASE ILLUSTRATION: PARSING OPERATIONAL SCENARIOS
Rule 10 provides four detailed illustrations that decode its operation:
Case 1: Parent Is a Registered User on the Fiduciary’s Platform
Scenario: Child (C) informs a Data Fiduciary (DF) that she is a child and declares Parent (P) as her parent. P is already a registered user of DF’s platform and has previously provided identity and age details.
Procedural Requirement:
– DF shall enable P to authenticate through the fiduciary’s website, app, or other means
– DF shall check its records to confirm it holds reliable identity and age details of P
– DF shall verify that P is an identifiable adult
– Only upon such verification shall DF process C’s personal data
Legal Interpretation: The Rule contemplates a database lookup and verification step. The fiduciary must programmatically check (a) whether P’s details are on file, (b) whether those details establish adulthood, and (c) whether the authenticating individual matches the stored P. This necessitates audit trails.
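A minimal sketch of that lookup, using an assumed in-memory record store and invented field names, might look as follows; the three checks correspond to (a), (b), and (c) above, and every outcome is appended to an audit log.

```python
# Hypothetical sketch of the Case 1 check: stored parent record -> adulthood -> audit entry.
# The record layout, identifiers, and in-memory stores are illustrative assumptions only.
from datetime import date, datetime, timezone

PARENT_RECORDS = {  # assumed store of previously verified, registered users
    "parent-42": {"date_of_birth": date(1985, 6, 1), "identity_verified": True},
}
AUDIT_LOG: list[dict] = []

def is_adult(dob: date, today: date | None = None) -> bool:
    today = today or date.today()
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18

def verify_registered_parent(declared_parent_id: str, authenticated_user_id: str) -> bool:
    record = PARENT_RECORDS.get(declared_parent_id)
    verified = (
        record is not None                               # (a) details are on file
        and record["identity_verified"]
        and is_adult(record["date_of_birth"])            # (b) details establish adulthood
        and authenticated_user_id == declared_parent_id  # (c) authenticating user matches stored P
    )
    AUDIT_LOG.append({                                   # audit trail for every outcome
        "checked_at": datetime.now(timezone.utc).isoformat(),
        "parent_id": declared_parent_id,
        "result": "verified" if verified else "rejected",
    })
    return verified
```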
Compliance Burden: A Data Fiduciary must maintain and regularly audit the accuracy of parent identity records. Outdated records create exposure. If a parent’s details were initially verified ten years ago and are now accessed, the fiduciary may face allegations that it did not exercise “due diligence.”
Case 2: Parent Is NOT a Registered User; Child Initiates Account Creation
Scenario: C informs DF that she is a child and declares P as her parent. P is NOT a registered user on DF’s platform.
Procedural Requirement:
– DF shall enable P to identify herself through its website, app, or other means
– DF shall verify P’s identity and age by reference to: details issued by an entity entrusted by law or the government with maintaining such details; OR a virtual token mapped to such details issued by an authorised entity; OR details verified and made available by a Digital Locker service provider
– P may voluntarily provide such details
Operational Challenge: The fiduciary must integrate with external verification systems. This is non-trivial. The fiduciary might (see the sketch below):
– Integrate with UIDAI’s e-KYC API (if permitted under the Aadhaar Act)
– Partner with Digital Locker service providers
– Accept government-issued ID scans verified through third-party verification services
– Employ video-KYC mechanisms
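As a hedged illustration of how a fiduciary might structure that choice, the sketch below tries whichever authorised verification routes are configured, in order of preference. Each verifier is a named placeholder rather than a real UIDAI, DigiLocker, or KYC vendor integration, and self-declaration alone never succeeds.

```python
# Hypothetical structure for Case 2: attempt verification against authorised external sources
# in a configured order of preference. Every verifier below is an invented placeholder, not a
# real UIDAI, DigiLocker, or KYC vendor API; each stub raises until actually integrated.
from typing import Callable

def verify_via_digital_locker(details: dict) -> bool:
    raise NotImplementedError("integration with a notified Digital Locker Service Provider")

def verify_via_ekyc(details: dict) -> bool:
    raise NotImplementedError("e-KYC check, if permitted under the Aadhaar Act")

def verify_via_document_check(details: dict) -> bool:
    raise NotImplementedError("third-party verification of government-issued ID scans")

VERIFIERS: dict[str, Callable[[dict], bool]] = {
    "digital_locker": verify_via_digital_locker,
    "ekyc": verify_via_ekyc,
    "document_check": verify_via_document_check,
}

def verify_unregistered_parent(details: dict, preferred_routes: list[str]) -> str | None:
    """Return the authorised route that confirmed identity and age, or None if none did.

    Mere receipt of self-declared details never counts as verification on its own.
    """
    for route in preferred_routes:
        verifier = VERIFIERS.get(route)
        if verifier is None:
            continue
        try:
            if verifier(details):
                return route
        except NotImplementedError:
            continue  # route not yet integrated; fall through to the next authorised source
    return None  # no authorised source confirmed the claim: do not process the child's data
```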
Legal Consequence: A fiduciary that relies on unverified self-declaration without attempting external verification faces breach liability. The Rule mandates a “check” by reference to authorised sources, not mere receipt of user-provided documents.
Case 3: Parent Initiates Account Creation; She Is a Registered User
Scenario: Parent P opens an account for Child C directly. P is a registered user on DF’s platform.
Procedural Requirement:
– DF shall verify that it holds reliable identity and age details of P
– DF shall confirm that P is an identifiable adult
Distinction from Case 1: Here, the parent is acting in her parental capacity at the outset, rather than the child initiating the process. The fiduciary must infer the parental relationship from P’s declaration and then verify P’s adulthood, not merely P’s registration.
Verification Complexity: The fiduciary must confirm that the person opening the account is indeed the adult registered user. This demands password/authentication verification and cannot be confused with mere account linkage.
Case 4: Parent Initiates Account Creation; She Is NOT a Registered User
Scenario: P opens an account for C. P is not a registered user on DF’s platform.
Procedural Requirement:
– DF shall verify P’s identity and age by reference to authorised sources or digital tokens
– P may voluntarily make such details available via Digital Locker or other authorised means
Full Due Diligence Cycle: The fiduciary must perform comprehensive external verification without any reliance on stored records, increasing operational burden and authentication friction.
V. DEFINING “ADULT,” “AUTHORISED ENTITY,” AND “DIGITAL LOCKER SERVICE PROVIDER”
A. Definition of “Adult”
Rule 10(2) defines an adult as “an individual who has completed the age of eighteen years.”
Judicial Interpretation Issue: This definition is straightforward but creates a verification challenge. Many digital platforms serve users globally. How does a fiduciary verify that an individual has completed 18 years when:
– The individual’s legal identity is stored in a foreign country?
– The individual’s age is calculated based on different calendar systems (e.g., Islamic or Buddhist calendar)?
– The individual’s birth certificate is in a language not recognised by the fiduciary’s verification system?
India’s courts may develop jurisprudence requiring fiduciaries to apply a reasonableness standard: did the fiduciary exercise reasonable diligence to verify adulthood given the available verification mechanisms?
B. Definition of “Authorised Entity”
An authorised entity is defined as:
1. An entity entrusted by law or by the Central Government or State Government with the issuance of details of identity and age or a virtual token mapped to such details
2. A person appointed or permitted by such entity for such issuance
3. Entities whose details or tokens are verified and made available by a Digital Locker Service Provider
Statutory Scope: This definition encompasses:
– UIDAI (Aadhaar authority)
– State-level registrars of births and deaths
– Ministry of External Affairs (Passport issuance)
– Transport authorities (Driving Licence issuance)
– Ministry of Social Justice (disability/vulnerability registries)
– Any other entity designated by the Central Government
Rule-Making Authority: The definition grants the Central Government discretion to notify additional authorised entities. This is a dynamic framework, subject to amendment.
C. Digital Locker Service Providers
Rule 10(2)(c) defines a Digital Locker Service Provider as:
“Such intermediary, including a body corporate or an agency of the appropriate Government, as may be notified by the Central Government, in accordance with the rules made in this regard under the Information Technology Act, 2000.”
Current Status: The Government of India’s Ministry of Electronics and Information Technology operates DigiLocker, the primary digital locker service. However, the Rule contemplates that other authorised intermediaries may be notified.
Regulatory Significance: By channelling verification through notified Digital Locker Service Providers, Rule 10 creates a tiered verification model:
Fiduciary -> Digital Locker Service Provider -> Authorised Entity (e.g., UIDAI) -> Government Records
This tiered approach reduces direct fiduciary exposure to identity data while maintaining cryptographic auditability.
VI. EXEMPTIONS FROM CHILD DATA PROTECTION OBLIGATIONS: RULE 12
Not all data processing involving children requires verifiable parental consent. Rule 12 carves out significant exemptions, structured in two parts:
A. Part A: Classes of Data Fiduciaries (Exempted by Entity Type)
Certain categories of entities may process children’s data without satisfying Rule 10’s verifiable consent requirement, subject to specified conditions:
1. Clinical Establishments, Mental Health Establishments, and Healthcare Professionals
Exemption Condition: Processing is restricted to provision of health services to the child, to the extent necessary for protection of her health.
Judicial Rationale: A doctor cannot be required to obtain parental consent before treating a child in a medical emergency. Similarly, healthcare data fiduciaries may process a child’s medical records without verifiable parental consent when the purpose is therapeutic.
Scope Limitation: This exemption does not permit healthcare fiduciaries to:
– Process a child’s data for marketing pharmaceutical products
– Use a child’s health data for targeted advertising
– Share health data with insurers or third parties (without separate consent)
Regulatory Implication: If a hospital subsidiary sells health data to an insurance company, verifiable parental consent is required because the purpose has shifted from healthcare provision to commercial processing.
2. Allied Healthcare Professionals
Exemption Condition: Processing is restricted to supporting implementation of healthcare treatment and referral plans recommended by a clinical professional.
Practical Application: Physiotherapists, nutritionists, and counsellors supporting a child’s treatment can process the child’s data without separate parental consent when the processing is subordinate to the primary healthcare purpose.
3. Educational Institutions
Exemption Condition: Processing is restricted to tracking and behavioural monitoring:
– For the educational activities of the institution, OR
– In the interests of safety of children enrolled
Regulatory Boundary: A school can monitor a child’s attendance, academic performance, and conduct to enable education. A school cannot use such data to profile a child’s interests for targeted advertising on third-party platforms.
Cross-Border Concern: If an educational technology (EdTech) company receives data from schools, the provider must ensure that such data is not re-purposed for profiling or targeted advertising without fresh parental consent.
4. Child Care and Crèche Operators
Exemption Condition: Processing is restricted to tracking and behavioural monitoring in the interests of safety.
Operational Scope: A crèche can track a child’s location, health vitals, and daily activities to ensure safety. It cannot process such data to build developmental profiles for sale to toy manufacturers.
5. School Transport Operators
Exemption Condition: Processing is restricted to tracking the location of children during their commute to and from school.
Technical Boundary: Real-time GPS tracking is permitted; retention of location history for behavioural pattern analysis is not.
B. Part B: Purposes for Which Verifiable Parental Consent Is Not Required
Part B of Schedule IV exempts certain purposes, regardless of the fiduciary’s entity classification:
1. Exercise of Power, Performance of Function, or Discharge of Duty in Children’s Interests
Exemption: Processing restricted to the extent necessary for exercise of power or performance of function under law.
Application: A child welfare board can process a child’s data (including sensitive data) to investigate child abuse, even without parental consent, because the legal duty to protect overrides the consent requirement.
Procedural Safeguard: The rule requires that processing be limited to the extent necessary and documented with clear reasoning.
2. Provision or Issuance of Subsidies, Benefits, Services, Certificates, or Licences
Exemption: Processing by government or policy-mandated entities.
Example: Under the Mid-Day Meal Scheme or scholarship programmes, government entities can process children’s personal data without verifiable parental consent. The legal mandate to provide the benefit permits processing without consent.
Scope: This exemption applies only to government bodies or fiduciaries explicitly entrusted with such provision under law or policy.
3. Account Creation for Email Communication
Exemption: Processing restricted to the extent necessary for email account creation.
Rationale: A child cannot meaningfully participate in online education or government services without an email account. The exemption permits email-only account creation with minimal data collection.
Restriction: The exemption does not permit collection of location data, device identifiers, or behavioural tracking for email-only accounts.
VII. COMPLIANCE OBLIGATIONS: TECHNICAL AND ORGANISATIONAL MEASURES
A. The Mandatory Integration Requirement
Rule 10(1) mandates that fiduciaries “adopt appropriate technical and organisational measures” to ensure verifiable parental consent.
Technical Measures Include:
1. Age Verification APIs: Integration with e-KYC service providers or Digital Locker to verify user age before processing personal data.
2. Parental Authentication Workflows: Building secure authentication flows where:
– The child declares the parental relationship
– The system prompts for parental authentication
– Parental identity is verified against authorised sources
– Consent is recorded with timestamp, authentication method, and audit trails
3. Consent Recording and Audit Trails: Every instance of parental consent must be logged (a minimal sketch of such a record follows this list) with:
– Date and time of consent
– Identity of the parent (hashed or tokenised)
– Verification method used (platform record, digital locker token, ID verification)
– IP address or device fingerprint
– Express statement of the purposes consented to
4. Withdrawal Mechanisms: Fiduciaries must enable parents to withdraw consent at any time through a process as easy as providing consent.
5. Data Minimisation Implementation: Technical controls ensuring that only data strictly necessary for the stated purpose is processed.
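A minimal sketch of the consent record contemplated by item 3 above, with assumed field names, could be as simple as the following; hashing the parent identifier keeps the log auditable without storing the identity itself.

```python
# Hypothetical consent-record structure for the audit trail described in item 3 above.
# Field names are assumptions; Rule 10 prescribes the substance of the record, not a schema.
import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    parent_id_hash: str        # identity of the parent, hashed or tokenised
    verification_method: str   # platform record, digital locker token, or ID verification
    ip_address: str            # IP address or device fingerprint
    purposes: tuple[str, ...]  # express statement of the purposes consented to
    consented_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def make_consent_record(parent_identifier: str, method: str, ip: str,
                        purposes: list[str]) -> ConsentRecord:
    return ConsentRecord(
        parent_id_hash=hashlib.sha256(parent_identifier.encode()).hexdigest(),
        verification_method=method,
        ip_address=ip,
        purposes=tuple(purposes),
    )

record = make_consent_record("parent-42", "digital_locker_token", "203.0.113.7",
                             ["child_account_creation"])
```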
B. Organisational Measures Include:
1. Privacy by Design: Incorporating child-protective principles into system architecture and the product development lifecycle.
2. Personnel Training: Compliance teams must understand Rule 10’s mechanics to audit verification logs and investigate alleged breaches.
3. Impact Assessments: Data fiduciaries processing significant volumes of children’s data should conduct Data Protection Impact Assessments (DPIAs) focusing on:
– The likelihood and severity of harm to children
– The reliability of the verification methodology
– Controls preventing secondary processing
4. Policies and Procedures: Written, accessible policies explaining how verifiable consent is obtained, verified, and withdrawn.
5. Third-Party Management: Contractual obligations on Data Processors and sub-processors to maintain child-protective standards.
VIII. JURISDICTIONAL IMPLICATIONS: THE ROLE OF COURTS
A. Breach Adjudication Under Rule 7
When a child’s data breach occurs and a data fiduciary claims exemption under Schedule IV, courts will examine:
1. Whether the exemption was properly invoked: Did the fiduciary establish that processing fell within the exempted class or purpose?
2. Whether processing was limited to the stated purpose: If a school exempted under Part A processed children’s data for targeted advertising, the exemption is vitiated.
3. Whether “due diligence” was observed: Even for exempted processing, fiduciaries must observe reasonable care in protecting children’s data.
B. Standards of Proof
On the Fiduciary: If accused of processing children’s data without verifiable parental consent, the fiduciary must demonstrate through documentary evidence that:
– Parental consent was obtained
– Verification was performed against authorised sources
– Audit trails support the verification claim
For the Data Protection Board (and ultimately, courts on appeal): The Board must scrutinise:
– Whether the verification methods employed were reliable
– Whether log retention meets the one-year requirement under Rule 6(1)(e)
– Whether the fiduciary’s post-breach investigation revealed negligence in the verification process
C. Emerging Jurisprudence
Indian courts are likely to develop standards around:
1. Reasonableness of Verification: What constitutes “appropriate” technical measures may vary by fiduciary size and capability. A startup’s verification system need not match a large tech company’s, but both must meet minimum standards.
2. Temporal Validity of Verification: How long can a fiduciary rely on a single verification? Courts may require periodic re-verification (e.g., annually) for long-term data relationships.
3. Cross-Border Verification: How can Indian fiduciaries verify parental identity for children or parents in foreign jurisdictions? This remains unresolved.
IX. CONSENT MANAGERS: THE MEDIATED VERIFICATION MODEL
A. Role of Consent Managers Under Rule 4
While Rule 10 enables direct verification (fiduciary to parent), the Act also contemplates mediated verification through Consent Managers: intermediaries who maintain platforms enabling data principals to provide, review, and withdraw consent.
Operational Mechanism for Child Data (a sketch of this routing follows the steps):
1. A Consent Manager’s platform enables a parent to authenticate herself (through Digital Locker or identity verification)
2. The parent declares her relationship to a child
3. The Consent Manager routes consent requests from data fiduciaries to the authenticated parent
4. The parent provides or denies consent, with the Consent Manager maintaining immutable records
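A highly simplified sketch of that routing appears below. The class and method names are invented stand-ins, not a registered Consent Manager's actual interface, and the parent's decision, which in practice would arrive asynchronously, is passed in directly; the essential points are that no request is routed without an authenticated parental relationship and that every decision is logged append-only.

```python
# Hypothetical sketch of Consent Manager mediation for child data. Names are invented;
# the DPDP framework prescribes the role, not this interface.
from datetime import datetime, timezone

class ConsentManager:
    def __init__(self) -> None:
        self._authenticated_parents: dict[str, set[str]] = {}  # parent_id -> declared child_ids
        self._records: list[dict] = []                          # append-only decision log

    def register_parent(self, parent_id: str, child_id: str, verified_adult: bool) -> None:
        """Steps 1-2: the parent authenticates (e.g. via Digital Locker) and declares the relationship."""
        if verified_adult:
            self._authenticated_parents.setdefault(parent_id, set()).add(child_id)

    def route_request(self, fiduciary: str, parent_id: str, child_id: str,
                      purpose: str, parent_approves: bool) -> bool:
        """Steps 3-4: route the fiduciary's request to the authenticated parent and log the decision."""
        if child_id not in self._authenticated_parents.get(parent_id, set()):
            return False  # no authenticated parental relationship on record
        self._records.append({
            "decided_at": datetime.now(timezone.utc).isoformat(),
            "fiduciary": fiduciary,
            "parent_id": parent_id,
            "child_id": child_id,
            "purpose": purpose,
            "decision": "granted" if parent_approves else "denied",
        })
        return parent_approves

cm = ConsentManager()
cm.register_parent("parent-42", "child-7", verified_adult=True)
cm.route_request("social-app.example", "parent-42", "child-7", "account_creation", parent_approves=True)
```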
Liability Allocation: If a Consent Manager facilitates verifiable parental consent for child data processing, the fiduciary’s liability is reduced to risks of downstream breach. The Consent Manager assumes liability for the accuracy of consent mediation.
Registration Requirements: Consent Managers must:
– Be companies incorporated in India
– Maintain a minimum net worth of ₹2 crore
– Avoid conflicts of interest with data fiduciaries
– Undergo Board registration and audit
X. COMPARATIVE GLOBAL ANALYSIS
How Does India’s Approach Compare?

| Jurisdiction | Age Threshold | Verification Requirement | Verification Methods | Operational Distinction |
| --- | --- | --- | --- | --- |
| EU (GDPR) | 13-16 (varies by member state) | Parental consent | No prescribed methods; “reasonableness” standard | Prescriptive but flexible |
| USA (COPPA) | Under 13 | Verifiable parental consent | Signed forms, credit card, video verification, govt. ID | Technological neutrality; diverse methods permitted |
| Brazil (LGPD) | Under 13 | Parental consent | No prescribed methods specified | Emerging jurisprudence |
| Singapore (PDPA) | Under 13 | Parental consent | Not statutorily prescribed | Principles-based approach |
| India (DPDP Rules) | Under 18 | Verifiable parental consent | Prescribed: Digital Locker tokens, government-issued ID, platform records, virtual tokens | Prescriptive with technology-enabled specificity; unique Digital Locker integration |
India’s Distinctive Features:
1. Highest Age Threshold: At 18 years, India protects children at least two years longer than the EU (where the ceiling is 16) and five years longer than COPPA.
2. Prescribed Verification Methods: Unlike the GDPR (which applies a broad “reasonableness” standard) or COPPA (which offers multiple options), India prescribes specific methods, reducing fiduciary discretion.
3. Digital Locker Mandate: India’s integration of government digital infrastructure (Digital Locker/Aadhaar) creates a unique public-private verification model absent in other jurisdictions.
4. Operational Specificity: The four-case illustrations provide granular procedural guidance absent in most jurisdictions.
XI. PRACTICAL COMPLIANCE ROADMAP FOR FIDUCIARIES
Phase 1: Audit and Classification (Immediate)
• Identify all processing of children’s data
• Determine whether processing falls within exempted classes or purposes
• For non-exempt processing, map existing verification systems
Phase 2: Technical Implementation (Months 1-3)
• Integrate e-KYC or Digital Locker APIs
• Build age verification workflows
• Implement consent recording and audit logging
• Redesign user onboarding for children
Phase 3: Policy and Process Development (Months 1-2)
• Draft Privacy Policies explaining child data handling
• Create procedures for verifying parental identity
• Develop consent withdrawal processes
• Train compliance and customer support teams
Phase 4: Vendor Management (Ongoing)
• Audit third-party processors’ child-protective controls
• Establish contracts mandating compliance
• Conduct periodic audits of processor capabilities
Phase 5: Breach Readiness (Ongoing)
• Establish 72-hour breach notification processes
• Maintain audit trails for potential breach investigations
• Conduct periodic testing of incident response procedures
XII. JUDICIAL IMPLICATIONS: KEY QUESTIONS FOR COURTS
A. The Reasonableness Doctrine
Question: What constitutes “appropriate” technical measures? Is reasonableness assessed by reference to:
– Industry standards?
– The fiduciary’s size and resources?
– The sensitivity of the child’s data category?
Precedent: Courts will likely adopt a proportionality framework, requiring larger fiduciaries with greater resources to implement more rigorous verification than startups, but setting a minimum floor for all.
B. Duty of Verification
Question: Does Rule 10 impose an absolute duty to verify parental identity, or a duty of due diligence (reasonable efforts)?
Answer: The language “shall observe due diligence” suggests the latter. However, if a fiduciary could have easily integrated Digital Locker but chose not to, courts may find that due diligence was not observed.
C. Temporal Scope of Consent
Question: Once verifiable parental consent is obtained, for how long is it valid?
Anticipated Ruling: Consent likely remains valid for the duration of the data relationship or purpose. Upon a material change of purpose, fresh consent is required. Upon significant passage of time (courts may establish thresholds), re-verification may be required.
D. Burden of Proof
Question: Upon an allegation of processing without verifiable consent, who bears the burden of proving compliance?
Answer: Under the accountability principle in Article 5(2) of the GDPR, the controller bears the burden of demonstrating compliance. India’s framework is likely to adopt a similar standard.
XIII. CONCLUSION: IMPLICATIONS FOR THE JUDICIARY
The DPDP Rules, 2025, particularly Rule 10, represent a maturation of India’s data protection framework from principles to procedures. For judicial officers, advocates, and legal scholars, several takeaways emerge:
1. Prescription Over Discretion: Unlike prior data protection regimes emphasising fiduciary discretion, Rule 10 prescribes specific verification methods, reducing subjective judgment and clarifying compliance standards.
2. Technology as Regulatory Tool: The framework integrates Digital Locker, e-KYC, and virtual tokens into the enforcement mechanism itself, making compliance inseparable from technological adoption.
3. Tiered Accountability: Liability is distributed among fiduciaries, Consent Managers, and Digital Locker Service Providers, requiring courts to analyse breach responsibility across multiple actors.
4. Emerging Jurisprudence: Courts will develop standards around proportionality, reasonableness, and the temporal validity of consent, areas where Rule 10 provides procedural specificity but normative latitude.
5. International Harmonisation: India’s approach seeks to balance the GDPR’s prescriptive rigour with COPPA’s technological flexibility, while leveraging India’s unique digital infrastructure advantages.
For practitioners, judges, and policymakers, Rule 10 signals that child data protection is no longer aspirational but operational: it is enforced through documented verification trails, technical integration, and institutional accountability mechanisms.
The judiciary’s role will be to interpret Rule 10 in a manner that:
– Protects children from data exploitation without imposing impossible compliance burdens on fiduciaries
– Recognises India’s technological infrastructure as an asset for verification
– Develops proportionality standards ensuring that compliance requirements scale with fiduciary capacity
– Establishes clear precedents to guide fiduciaries in implementing age verification and parental consent systems
This framework inaugurates a new phase of data protection jurisprudence in India: one where technology, law, and institutional accountability converge in service of a fundamental right, a child’s data dignity.
REFERENCES
1. Digital Personal Data Protection Act, 2023 (Act No. 22 of 2023)
2. Digital Personal Data Protection Rules, 2025 (notified November 13, 2025)
3. Information Technology Act, 2000 (Act No. 21 of 2000)
4. Rights of Persons with Disabilities Act, 2016 (Act No. 49 of 2016)
5. National Trust for the Welfare of Persons with Autism, Cerebral Palsy, Mental Retardation and Multiple Disabilities Act, 1999 (Act No. 44 of 1999)
6. General Data Protection Regulation (GDPR), European Union
7. Children’s Online Privacy Protection Act (COPPA), United States
8. Digital Locker Infrastructure, Ministry of Electronics and Information Technology, Government of India