India's landmark privacy law — decoded from 21 pages of legalese into something a human can actually understand and use.
Every time you install an app, sign up for a website, or swipe your loyalty card, your personal data — your name, phone number, location, purchase history — is collected. For decades, companies in India could do almost anything with that data. The Digital Personal Data Protection Act (DPDPA) 2023 changes that permanently.
This Act is India's first comprehensive digital privacy law. It establishes a simple but powerful idea: your personal data belongs to you, not to the companies that collect it. Companies that want to use your data must ask, must explain why, and must protect it — or face massive fines.
Think of it as India's answer to Europe's GDPR (General Data Protection Regulation). It creates new rights for every Indian citizen, new duties for every company that handles data, a brand new regulator called the Data Protection Board, and penalties reaching up to ₹250 crore per violation.
Whether you're a student, a business owner, a developer, or just someone who uses apps and websites — this law directly affects your daily digital life. Understanding it is no longer optional.
Laws are written in their own language. Before you can understand what the DPDPA requires, you need to learn the vocabulary. These six terms appear on almost every page — master them and everything else unlocks.
You (Data Principal) deposit money at HDFC Bank (Data Fiduciary). HDFC hires a cash-in-transit company (Data Processor) to move money between branches. A Consent Manager is like a financial advisor who manages all your banking authorisations in one app. And RBI (Data Protection Board) is the regulator watching over everyone.
Section 3 defines the law's jurisdiction. It has a very long arm — it reaches Indian companies, foreign companies serving Indian users, and even data that started offline but was later digitised.
Scenario: A US-based SaaS company sells software subscriptions to Indian businesses and stores Indian user data on servers in the USA. Does Indian law apply?
Answer: YES. Because they are offering services to Data Principals within India, this Act applies to them — even though they're based in America and their servers are there.
Priya maintains a personal diary app with her daily thoughts. She processes her own data for purely personal use. The DPDPA does not govern this.
A wellness startup collects Priya's health data through their app to "improve their product." Even if Priya consented, the startup must follow all DPDPA rules.
Sections 4, 5, and 6 form the heart of the Act. The DPDPA says companies can only process your data if you've either (a) specifically consented, or (b) there's a legitimate legal reason. Chapter 3 covers consent. Chapter 4 covers legitimate reasons.
Before a company asks for your consent, it must give you a clear, plain-language notice telling you what personal data it will collect, the purpose it will be used for, how you can exercise your rights and withdraw consent, and how to file a complaint with the Data Protection Board.
BAD (what used to happen): "By continuing you agree to our Privacy Policy." — A link to 8,000 words of legal text in English only.
GOOD (what Section 5 requires): A clear popup that says: "We will collect your name, phone number, and location to deliver your order. You can withdraw this consent anytime by going to Settings → Privacy → Withdraw Consent. To file a complaint, visit board.gov.in."
Not just any "I agree" qualifies. For consent to be legally valid under the Act, it must be ALL of the following:
| Quality | What it Means | What Violates It |
|---|---|---|
| Free | Not given under pressure or as a condition for basic service | "You can't use our app unless you consent to targeted ads" |
| Specific | Given for one particular purpose, not a blanket "everything" | "We may use your data for all our business purposes" |
| Informed | You know what you're agreeing to | Hidden in fine print or vague jargon |
| Unconditional | No strings attached | "Consent to marketing, or we won't process your refund" |
| Unambiguous | A clear yes — not assumed from silence or pre-ticked boxes | Pre-checked boxes, or "If you don't reply, we'll assume you agree" |
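For a developer building a consent flow, the five conditions above work as a checklist. Here is a minimal, illustrative Python sketch; the `ConsentRecord` fields and the `consent_is_valid` helper are invented names for illustration, not anything defined by the Act:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    # Illustrative fields, one per statutory validity condition
    tied_to_basic_service: bool   # breaks "Free" if True
    purposes: list[str]           # must be one specific purpose, not a blanket grant
    notice_shown: bool            # "Informed": plain-language notice was displayed
    bundled_with_condition: bool  # breaks "Unconditional" if True
    checkbox_prechecked: bool     # breaks "Unambiguous" if True
    user_clicked_agree: bool      # silence or inactivity is never consent

def consent_is_valid(c: ConsentRecord) -> bool:
    """Return True only if all five conditions appear satisfied."""
    return (
        not c.tied_to_basic_service        # Free
        and len(c.purposes) == 1           # Specific
        and c.notice_shown                 # Informed
        and not c.bundled_with_condition   # Unconditional
        and not c.checkbox_prechecked      # Unambiguous
        and c.user_clicked_agree
    )

# A pre-ticked box fails even if everything else is in order
bad = ConsentRecord(False, ["deliver order"], True, False, True, True)
good = ConsentRecord(False, ["deliver order"], True, False, False, True)
print(consent_is_valid(bad), consent_is_valid(good))  # False True
```

The point of the sketch: validity is conjunctive. A single failed condition invalidates the whole consent, no matter how cleanly the rest of the flow was built.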
Even with valid consent, a company can only collect the minimum data necessary for the stated purpose. Consent for A doesn't mean unlimited access to everything.
Scenario: You download a telemedicine app. The app asks for consent to (1) process your health data for consultations, and (2) access your phone's contact list. You agree to both.
Result under Section 6: Your consent for the contact list is INVALID — contacts are not necessary for telemedicine. The app cannot use that data even though you technically clicked "agree."
You have the absolute right to withdraw consent at any time. And here's the key rule: withdrawing must be as easy as giving. If you can consent in one tap, you must be able to revoke in one tap.
If you withdraw consent, the company can stop providing the service — but it cannot undo lawful processing that already happened. Example: You withdraw consent from a food delivery app mid-order. The app may stop your future orders, but it must still deliver the order you already paid for.
Section 6(7) introduces the Consent Manager concept. Instead of managing consent separately on 50 different apps, you can use a single Consent Manager platform to give, review, and withdraw consent across all of them. The Consent Manager acts on your behalf and is accountable to you.
A Consent Manager is like a master password app (like 1Password or Bitwarden) — except instead of storing passwords, it stores and manages all your data permissions. One place. Full control. No more hunting through 50 apps to find where you agreed to let someone use your data.
Section 7 lists situations where companies and the government can process your data without your explicit consent. These are called "certain legitimate uses." They're not loopholes — they're carefully defined situations where requiring individual consent every time would be impractical or harmful.
| # | Situation | Real Example |
|---|---|---|
| 1 | You voluntarily shared data for a purpose and didn't object | You give your number to a pharmacy to receive a payment receipt SMS |
| 2 | State providing a subsidy, benefit, or service you previously consented to | Government uses your Aadhaar data to check if you qualify for a second benefit scheme |
| 3 | State performing a legal function (law enforcement, national security) | Police accessing call records during an investigation under a court order |
| 4 | Complying with a legal obligation to disclose information | A bank reporting suspicious transactions to the Financial Intelligence Unit as required by law |
| 5 | Complying with a court judgment or decree | A company sharing employee data as ordered by a labour court |
| 6 | Medical emergency — threat to life | Hospital using your blood group data without asking because you arrived unconscious |
| 7 | Epidemic or public health threat | Government contact-tracing during a pandemic |
| 8 | Disaster or breakdown of public order | Rescue teams using location data during a flood |
| 9 | Employment-related processing (preventing espionage, protecting trade secrets) | Company monitoring access to classified files on its internal systems |
A doctor treating an unconscious accident victim doesn't need to get signed consent before saving their life. Similarly, Section 7 exemptions exist for situations where waiting for consent would cause real harm — emergencies, legal obligations, and critical state functions. The law is humane, not rigid.
Section 8 is the most detailed section for businesses. It lists exactly what every company that collects your data must do — regardless of size, regardless of industry. Think of it as the company's checklist.
Even if a company outsources data processing to another vendor, the Data Fiduciary is still fully responsible. You can't outsource liability. If a third-party cloud service leaks your data, the company that hired them is still on the hook.
If data is likely to be used to make a decision that affects you (a loan application, a background check), or will be shared with another company, the Data Fiduciary must ensure it is accurate, complete, and consistent.
Your credit bureau score contains an error showing a loan you never took. A bank queries this bureau before approving your home loan. The bureau (Data Fiduciary) is obligated under Section 8(3) to maintain accurate data — because it's being used to make a decision that directly affects you.
Companies must protect data with reasonable technical and organisational measures. The specific safeguards are detailed in the Rules — encryption, access controls, audit logs, backups. "Reasonable" is judged against the risk level of the data.
If there's a breach, companies must notify both the Board and every affected individual. The Rules specify a 72-hour window for the Board notification. The individual notification must happen immediately.
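The two deadlines can be sketched in code. This is an illustrative helper under the assumptions stated above (immediate individual notice, 72-hour Board window), not an official compliance tool; the function and constant names are mine:

```python
from datetime import datetime, timedelta

# Outer limit for the detailed Board report, per the Rules as described above
BOARD_WINDOW = timedelta(hours=72)

def board_report_overdue(detected_at: datetime, now: datetime) -> bool:
    """True once the 72-hour window for notifying the Board has lapsed.
    Affected individuals must be told immediately, so there is no
    individual-notice window to compute at all."""
    return now > detected_at + BOARD_WINDOW

detected = datetime(2024, 6, 1, 9, 0)
print(board_report_overdue(detected, datetime(2024, 6, 3, 9, 0)))  # False (48h elapsed)
print(board_report_overdue(detected, datetime(2024, 6, 5, 9, 0)))  # True  (96h elapsed)
```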
A company must delete your data (and instruct its processors to delete it) as soon as EITHER of two things happens, whichever comes first: (a) you withdraw your consent, or (b) the purpose for which the data was collected is no longer being served. The only exception is data the company is legally required to retain.
If you stop using a service — you neither contact the company nor exercise your rights — for a specified period (set by the Rules: 3 years for large platforms), the company must assume the purpose is no longer being served and automatically erase your data. You don't need to explicitly ask.
Before the Act: you stopped using a food delivery app in 2018. Your name, address, credit card details, and order history still sat in their database in 2024, and no deletion was required.
Under the DPDPA: if you don't use the app for 3 years, the company must erase your data, even without you asking. It must warn you 48 hours before doing so, giving you a chance to re-engage.
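A compliance team might schedule this with a simple helper. The 3-year and 48-hour figures mirror the text above; the function names and the fixed-day arithmetic are illustrative assumptions, not prescribed by the Rules:

```python
from datetime import date, timedelta

INACTIVITY_PERIOD = timedelta(days=3 * 365)  # illustrative: 3-year period from the Rules
WARNING_LEAD = timedelta(days=2)             # 48-hour advance warning before erasure

def erasure_schedule(last_interaction: date) -> dict:
    """When a dormant user must be warned and erased. The clock runs
    from the user's last contact or last exercise of their rights."""
    erase_on = last_interaction + INACTIVITY_PERIOD
    return {"warn_user_on": erase_on - WARNING_LEAD, "erase_on": erase_on}

def must_erase(last_interaction: date, today: date) -> bool:
    """True once the inactivity period has fully elapsed."""
    return today >= last_interaction + INACTIVITY_PERIOD

# A user last seen on 1 Jan 2020 crosses the 3-year mark at the end of 2022
print(must_erase(date(2020, 1, 1), date(2022, 1, 1)))   # False
print(must_erase(date(2020, 1, 1), date(2023, 1, 15)))  # True
```

Any interaction from the user (a login, a rights request) would reset `last_interaction` and push the whole schedule forward.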
Every company must publicly publish on its website: the name and contact details of a person (or a Data Protection Officer for big companies) who can answer your questions. And it must have an effective system to handle your complaints — within the timeframe specified by the Rules (90 days).
Chapter III (Sections 11–15) is the most empowering part of the Act. These rights belong to every Indian whose data is processed digitally. They're not suggestions — they're legally enforceable entitlements. But rights come with duties too.
You can ask any company holding your data to tell you:
- a summary of the personal data it is processing about you
- the processing activities it has carried out with that data
- the identities of all other Data Fiduciaries and Data Processors your data has been shared with, and what was shared
Imagine you can walk into any company and say, "Show me your file on me." Section 11 gives you that right digitally. No more mystery about who knows what about you.
If your data is wrong, incomplete, or outdated, you can demand that it be:
- corrected, where it is inaccurate or misleading
- completed, where it is incomplete
- updated, where it is out of date
- erased, once it is no longer needed for the purpose it was collected for (unless the law requires it to be kept)
Your name is misspelled in a telecom company's records as "Rahool" instead of "Rahul." Because of this, your name on bills is wrong. Under Section 12(2), you have the right to demand they correct it — and they must.
If a company fails to meet its obligations or violates your rights, you have the right to file a formal complaint — and they must respond within the prescribed time. You must first try to resolve it with the company before approaching the Board. The Board is the last resort, not the first.
You can nominate another person — a family member or trusted individual — to exercise your data rights on your behalf if you die or become incapacitated. This is like a digital power of attorney specifically for your personal data rights.
Why this matters: This is especially meaningful for elderly parents who may not be tech-savvy. Their adult children can be nominated to manage data rights on their behalf if they become unable to do so.
Rights and duties are two sides of the same coin. The Act also lists what you owe to the system:
- comply with all applicable laws when exercising your rights
- don't impersonate someone else when providing personal data
- don't suppress material information when giving data for official documents or identifiers
- don't file false or frivolous complaints
- furnish only verifiably authentic information when requesting correction or erasure
If you file a false or frivolous complaint, the Board can impose a penalty on YOU — up to ₹10,000. The Act protects everyone, including companies from bad-faith complainants.
Sections 9 and 10 create two special layers of protection: one for children under 18 (the most vulnerable), and one for large, high-impact companies (the most powerful). Both face stricter rules than the average case.
A child is defined as anyone under 18 years. The key rules:
- verifiable consent from a parent or lawful guardian is required before processing a child's data
- no processing that is likely to cause any detrimental effect on a child's well-being
- no tracking or behavioural monitoring of children, and no advertising targeted at them
A 15-year-old wants to join a gaming platform. The platform CANNOT just accept the child's claim of being 18. It must verify the parent's identity and age, and get the parent's explicit consent before creating the account. The parent must be a real, identifiable adult — not just a name typed in a box.
Think of children's data protections like a child-proof cap on medicine bottles. The mechanism is deliberately harder to bypass — not to inconvenience adults, but because the stakes of getting it wrong are so much higher when children are involved.
The government can designate any company as a "Significant Data Fiduciary" based on factors like:
- the volume and sensitivity of the personal data it processes
- the risk its processing poses to the rights of Data Principals and to electoral democracy
- its potential impact on the sovereignty and integrity of India, the security of the State, and public order
Once designated, these companies must additionally:
| Extra Obligation | What It Involves |
|---|---|
| Data Protection Officer (DPO) | Must appoint an India-based DPO who reports to the Board of Directors and is the point of contact for all data-related matters |
| Independent Data Auditor | An external auditor evaluates compliance with the Act — not self-certified |
| Data Protection Impact Assessment | A formal annual study of risks their processing poses to users' rights |
| Periodic Audit | Annual compliance review and reporting to the Board |
| Algorithm Risk Review | Verify that AI/software used doesn't pose risks to Data Principals (per Rules) |
Who will likely be notified as Significant Data Fiduciaries? Companies like Meta (Instagram, WhatsApp), Google, Amazon, Flipkart, Ola, Paytm, and large banks are widely expected to be notified — though the government hasn't officially published the list yet.
Section 17 lists situations where major parts of the Act simply don't apply. These are not loopholes — they reflect genuine practical realities: law enforcement, national security, research, and startups all need different treatment.
Several obligations (from Chapters II and III and Section 16) do not apply to processing done for:
- enforcing a legal right or claim
- courts, tribunals, and bodies performing judicial, quasi-judicial, or regulatory functions
- prevention, detection, investigation, or prosecution of offences
- data of people outside India, processed in India under a contract with a foreign entity
- court-approved mergers, demergers, and similar corporate schemes
- ascertaining the financial position of someone who has defaulted on a loan
The entire Act doesn't apply to:
- government agencies that the Central Government notifies in the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign states, or the maintenance of public order
- personal data processed for research, archival, or statistical purposes, provided it follows the prescribed standards
The government can notify certain Data Fiduciaries — including startups — as exempt from several obligations, such as the notice requirement and certain accuracy, erasure, and access provisions.
Why startups? Requiring a 2-person startup to have the same compliance infrastructure as a Fortune 500 company would kill innovation. The exemption is a proportionate response — lighter obligations for lower-risk, smaller-scale operations.
Your data can be sent outside India — but the Central Government can restrict transfers to specific countries by notification. This means India can block data from going to countries that don't adequately protect Indian citizens' data.
Chapters V–VIII (Sections 18–34) establish the enforcement machinery. A law without enforcement is just advice. The DPDPA creates real institutional muscle — a Board that can summon companies, impose massive fines, and even get websites blocked.
The Board is a statutory body — created by law, with perpetual existence and legal personality. It can sue and be sued in its own name. It has its headquarters wherever the Central Government decides.
Members are selected for their expertise in fields such as data governance, dispute resolution, information and communication technology, the digital economy, and law or regulation.
Board members hold office for a 2-year term (renewable). Their compensation and terms cannot be worsened after appointment — protecting their independence.
If you disagree with a Board order, you can appeal to the Appellate Tribunal (the Telecom Disputes Settlement and Appellate Tribunal — TDSAT). The appeal must be filed within 60 days of the order, in digital form.
The Tribunal aims to resolve appeals within 6 months. If it can't, it must record reasons in writing.
If the Board imposes penalties on a company two or more times and advises it's in the public interest, the Central Government can direct any intermediary to block the company's website or app from being accessible in India.
Website blocking is the nuclear option — the data equivalent of losing your operating licence. For a company whose entire business model depends on Indian users (like a social media platform), this could mean billions in losses. It's a powerful deterrent.
The penalty structure is like road traffic fines — calibrated by the danger you caused. Drunk driving (causing a breach through neglect) costs the most. Failing to report an accident you caused is the next tier. Minor infractions cost less. The severity reflects the real-world harm.
Your data belongs to you. The Act's foundational principle: you are the Data Principal — the data is about you, and you have ultimate rights over it.
Consent must be FSIUU. Free, Specific, Informed, Unconditional, Unambiguous. If any of these is missing, consent is invalid.
Data minimisation is mandatory. Companies can only collect the minimum data needed for the stated purpose. Consent for X doesn't give access to Y and Z.
Opt-out must equal opt-in. Withdrawing consent must be as easy as giving it. The law explicitly requires this symmetry.
Companies can't hide behind processors. If your cloud vendor leaks data, the company that hired them is still responsible — full stop.
Breaches must be reported — fast. Notify the Board within 72 hours and affected users immediately. Hiding a breach costs up to ₹200 crore.
Inactivity triggers deletion. If you ghost a platform for the prescribed period, they must automatically delete your data — even without a request.
Children get maximum protection. No tracking, no targeted ads, and mandatory verified parental consent. No exceptions for convenience.
Big companies face bigger duties. Significant Data Fiduciaries must have a DPO, an independent auditor, and annual impact assessments — in addition to all standard obligations.
The Board is the last resort. You must first try to resolve complaints with the company. Only then can you escalate to the Board.
Repeat offenders face the nuclear option. Penalised twice? The government can block your entire platform from being accessible in India.
The maximum penalty is ₹250 crore. For failing to implement security safeguards — a number designed to hurt even the largest companies meaningfully.