India’s Privacy Law: The Digital Personal Data Protection Act, 2023 (DPDPA) has stirred considerable debate in India. Hailed as the most consumer-centric legislation to date, the DPDPA regulates the previously unregulated business practice of collecting user data. However, it leaves quite a few loopholes that businesses can easily exploit, and the legislation itself is far from ideal. This article discusses the issues with the DPDPA and how it is miles away from being consumer-friendly legislation, let alone an ideal privacy law.
The Notice Problem:
Once the Act comes into force, a Data Fiduciary (the company that wants to collect data) can collect personal data only after requesting and obtaining the person’s consent.
The Act dictates that a ‘request for consent’ should be made in the form of a notice, and it must contain the following:
the personal data sought and the purpose for which it will be used; how a person can exercise their rights; and how a person can complain to the Data Protection Board (hereinafter, the Board).
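For illustration only, since the Act prescribes the contents of the notice but not its format, a consent-notice record covering these elements could be sketched as follows (the field names are hypothetical, not taken from the Act):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ConsentNotice:
    """Illustrative sketch of the elements a DPDPA consent notice must convey.
    Field names are hypothetical; the Act mandates content, not structure."""
    personal_data_sought: List[str]   # the items of personal data requested
    purpose: str                      # the specific purpose of processing
    rights_info: str                  # how the Data Principal can exercise their rights
    complaint_info: str               # how to complain to the Data Protection Board

# A notice an online retailer might present before seeking consent:
notice = ConsentNotice(
    personal_data_sought=["name", "email", "delivery address"],
    purpose="account creation and order delivery",
    rights_info="via the in-app privacy dashboard",
    complaint_info="grievance link to the Data Protection Board",
)
```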
If a Data Fiduciary was already collecting data on or before the day the Act came into force, it must make a fresh request for consent “as soon as possible”.
Because no time limit is set within which existing companies must issue this fresh request-for-consent notice, companies / Data Fiduciaries can continue collecting data without giving one, under the pretence that it is not yet possible to do so.
This becomes even more problematic for companies with significant market power and lax privacy policies: they can continue to collect excessive amounts of data, withhold from consumers the breadth of data collected and the specific purposes it is used for, build a protective moat around their market position with that data, and raise barriers to entry for any new competitor.
Consequently, this creates an imbalance in competition because it affects the ability of new entrants to penetrate the market while an established player collects excessive data without accountability.
The Consent Problem:
The Act requires the consent provided by the Data Principal to be free, specific, informed, unambiguous and unconditional, given with a clear affirmative action (not a passive one). It also states that the Data Fiduciary has to make a request for consent once, after the Act is implemented.
The Act nowhere mentions or defines ‘free’ and ‘informed’. In an era of network effects, where users join a platform because their friends are already there, and of skipping straight to checking the box next to “I agree to the terms & conditions”, the concepts of ‘free’ and ‘informed’ are themselves questionable.
It also does not discuss what needs to happen if the Data Fiduciary updates the privacy policy after giving the initial Request for consent notice.
It requires a clear affirmative action only the first time, not when the policy is later updated. Because passive acceptance (where users accept the terms and conditions through continued use) is allowed when the policy is updated, consumers, more than businesses, stand to lose: a business can put up a benevolent front at first and then update its privacy policy to keep collecting excessive and unreasonable amounts of information about the user.
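To make the asymmetry concrete, here is a deliberately simplified, hypothetical sketch of the consent trigger the Act effectively creates (the event names are invented for illustration, not drawn from the Act):

```python
def fresh_consent_required(event: str) -> bool:
    """Hypothetical sketch of the asymmetry described above: a clear
    affirmative action is demanded only at first collection, while a later
    privacy-policy update can ride on passive, continued-use acceptance."""
    if event == "first_collection":
        return True   # notice plus clear affirmative action required
    if event == "policy_update":
        return False  # continued use is treated as acceptance
    return False

print(fresh_consent_required("first_collection"))  # True
print(fresh_consent_required("policy_update"))     # False
```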
Passive acceptance is not a new concept in digital markets, and neither is a business collecting data on its users and using it to increase their resistance to switching to other products.
The Purpose of the Collection Problem:
A Data Fiduciary is a person who determines the purpose & means of processing personal data, either by themselves or “in conjunction with others”.
The Act may have intended “in conjunction” to cover marketing agencies or business developers that assist a company’s growth. But because the Act nowhere explains what “in conjunction with others” means, an extremely worrying eventuality becomes plausible.
What if a business decides its privacy policy in conjunction with its competitors?
In a world where competition law still does not fully recognise data abuse as an anti-competitive practice, DPDPA presents a whole new possibility of a Cartel in terms of data collection.
Businesses situated at the same level of the market (basically, competitors) could agree to collect the same types of data, in the same excessive and invasive amounts, and to use it to influence users in the way they wish. Consumers, at the receiving end, would have no option but to agree to the terms the cartel decided. Suddenly, everyone in the industry faces a situation where their data is collected for purposes not specified, and it is the same with every competitor.
This is likely in industries with few players, like advertising services, social media/communication services, web browsers, operating systems, e-commerce services or online payment services, etc.
The Necessity Problem:
The DPDPA requires Data Fiduciaries to collect only the personal data necessary for the specified purpose; data collection should be limited to what that purpose requires.
On top of the fact that Data Fiduciaries can determine the purpose in conjunction with other Data Fiduciaries, this creates another issue. The Act leaves Data Fiduciaries to determine what data is ‘necessary’ to attain the specific purpose and what is not.
An online pharmacy does not need device information (e.g., the current operating system, the device ID, or when the device is accessed) to provide its services. Similarly, a video streaming application does not need to collect personal information like phone numbers or request access to the entire media library on the user’s phone. In the same way, a digital commerce platform accessing the location service on a mobile and a laptop 24/7 is unnecessary and borderline excessive. But because companies determine what is necessary and what is not, anything a company deems necessary for itself is legal in the eyes of the law.
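As a purely illustrative sketch (the purpose-to-fields mapping below is an assumption for the example, not something the Act prescribes), the necessity test the Act leaves to the Data Fiduciary itself amounts to little more than a set difference that the company is free to define:

```python
# Hypothetical mapping of a declared purpose to the data fields it genuinely needs.
NECESSARY_FIELDS = {
    "medicine_delivery": {"name", "delivery_address", "prescription", "phone"},
}

def excessive_fields(purpose: str, requested: set) -> set:
    """Return the requested fields that go beyond what the declared purpose needs."""
    return requested - NECESSARY_FIELDS.get(purpose, set())

# An online pharmacy asking for device details alongside delivery data:
requested = {"name", "delivery_address", "prescription", "phone",
             "device_id", "operating_system", "last_access_time"}
print(excessive_fields("medicine_delivery", requested))
# flags device_id, operating_system and last_access_time as excessive
```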
The recourse available to people is to approach the Board and prove that the data collected is unnecessary for the specified purpose. If the Board agrees, it can hit the company with a 50 Crore penalty. But that is a big ‘if’.
The Children’s Data Problem:
A Data Fiduciary can collect data from a child (defined by the Act as a person less than 18 years old) after obtaining verifiable permission from the legal guardian. The Act imposes two conditions on a Data Fiduciary that collects such data: it should not process or use the data in a way that is ‘likely’ to cause a ‘detrimental’ effect on the ‘well-being’ of the child, and it should not undertake tracking or behavioural monitoring of children or serve targeted advertisements to children.
While these are certainly good and welcome conditions, they create a big jurisdictional issue between the Competition Commission of India (CCI) and the Board.
This is because the question – whether a company’s activity while competing with its competitors, with the data and other resources it possesses, is unfair to the market or not – falls under the direct ambit of Competition Authorities worldwide.
The DPDPA, however, takes it upon itself to determine whether a particular activity of the company, using the data it collected, is detrimental to the welfare of a child. The problem with this idea is that an authority other than a competition authority decides whether an activity, committed by the company while competing, is detrimental to consumers. All of this is compounded by the fact that the DPDPA offers no criteria for determining which kinds of activities are detrimental and which are not.
First, the way the Data Protection Board would analyse whether a specific activity is detrimental may not be the same way the CCI assesses the abuse caused to children/consumers by that activity. And because the same activity is assessed differently, every company is thrown into limbo as to whether it is committing an abuse, detrimentally affecting children, or both! What does a business do in this situation?
Second, any activity that is considered abuse by the CCI is sure to be detrimental to the children. Doesn’t that result in separate cases in front of the CCI & the Board for the same activity? Doesn’t it mean double jeopardy?
Third, any activity considered detrimental by the Board may or may not be regarded as an abuse by the CCI when it applies the principles of competition law, and such an activity may well have a valid business justification. How is this dilemma to be reconciled?
Fourth, suppose the Board determines a particular activity to be detrimental to children, i.e. people less than 18 years old. What if the company undertakes the same activity against 19-year-olds?
Because competition authorities, while assessing abuse, can look beyond the factor of age, shouldn’t they be the ones to consider this issue? Is it a situation where, if a user is more than 18 years of age, the harm is assessed through competition law, and if they are less than 18, it is assessed under the DPDPA? Doesn’t that defeat the purpose of competition law, to a certain extent?
What DPDPA proposes vis-à-vis this issue has the potential to change the understanding of abuse and the role of the Competition Authority – CCI.
Even though the intent behind the law is to make sure no Data Fiduciary uses the data it collects in a way detrimental to children, because competition law is also concerned with the same issue, a challenge to the jurisdiction of the CCI inevitably arises.
The middle ground for reading these two laws together is the market power that the company/Data Fiduciary holds. If the case is against a dominant company with significant market power, the CCI would have the final say; if not, the Board. However, until the CCI fully formalises the treatment of data abuses (by first implementing the proposed Digital Competition Act) and until there is clarity on the issues mentioned above, this problem will not go anywhere.
The Consent Withdrawal Problem:
The Act states that Data Fiduciaries must make withdrawing consent as easy as it was for Data Principals to give it in the first place.
The name of the game in business is consumer retention. It is in a business’s interest to make things as appealing as possible so that consumers do NOT stop using its services.
That’s why chocolates and candy are kept next to the check-out counter. Imagine entering and exiting a store without seeing anything appealing even on the way out; the consumer experience suffers.
In other words, it is in the interest of the business to make it a bit difficult for the consumer to get out of the ecosystem. That is, inherently, not a problem – That’s just how businesses work. No one wants to lose a consumer, which is the business justification for why things are how they are.
Certainly, if users face a system where withdrawing their consent / getting out of the system is impossible, then yes, that kind of system is inherently problematic. Whether a particular system is difficult to get out of, such that consumers cannot choose the goods and services others provide, is a competition issue.
But because the Board will be empowered to assess this type of case as well, a question needs to be asked: given how easy it is for a consumer to provide consent when signing up for a service, literally checking a box, can a case be brought against any Data Fiduciary for not offering the exact same ease in withdrawing that consent?
If the provision is considered objectively, there is a possibility for a consumer to bring a case against a Data Fiduciary for NOT having the same ease while withdrawing the consent.
Because it is the Board that assesses such a case, this leads to the bigger question: will the Board consider the business justification (retaining consumers) while deciding it? If it doesn’t, the result is straightforward: a penalty of 50 Crores for violating the provisions of the Act.
The Data Deletion Problem:
The DPDPA requires the consent provided by the person (the Data Principal in the Act) to be free, unconditional, informed, specific, and unambiguous. It also says that the Data Principal can withdraw that consent at any time. Once the Data Principal withdraws consent, the Data Fiduciary must delete the data within a “reasonable time”.
Barring situations where another law determines how long a business can hold specific data, it falls to the Board to decide what amounts to a “reasonable time”. And this impacts competition significantly.
Take, for example, dating apps and other apps that often tie a unique device identifier (UID) to the user’s device when the user first accesses the service, and then fail to remove it from their databases even after the user deletes their profile or withdraws consent.
Suppose the user re-installs the app or re-creates a profile on the platform: because the device is already registered in the database (the UID record was never deleted), the app can see how much time has passed between the deletion of the profile and its re-creation, and use that to assess how desperate or needy a particular person is to meet someone.
This, in turn, allows them to intentionally suppress the visibility of that user’s profile and “nudge” them to take a premium subscription to get more visibility.
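As a hypothetical sketch of the mechanism described above (the record structure and names are assumptions, not drawn from any real app), retaining the UID after profile deletion makes this inference trivial:

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical retained record: the UID row survives the profile's deletion.
device_records = {
    "uid-3f9a": {"profile_deleted_at": datetime(2024, 1, 10)},
}

def churn_gap(uid: str, reinstalled_at: datetime) -> Optional[timedelta]:
    """If the device's UID was never purged, the platform can measure how
    quickly the user came back after deleting their profile."""
    record = device_records.get(uid)
    if record is None:
        return None  # data genuinely deleted: no inference possible
    return reinstalled_at - record["profile_deleted_at"]

gap = churn_gap("uid-3f9a", reinstalled_at=datetime(2024, 1, 18))
print(gap)  # 8 days; a short gap could be used to nudge the user towards a paid tier
```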
The Significant Data Fiduciary Problem:
The DPDPA recognises the market reality that certain companies collect large amounts of user data. It introduces the concept of Significant Data Fiduciaries (SDFs) to require these companies to be more responsible with their data collection and processing.
The Central Government will determine and recognise SDFs based on factors such as the volume and sensitivity of the data processed, the risk to the rights of Data Principals, the risk to electoral democracy, the security of the State, and public order. Once recognised, SDFs must appoint a Data Auditor and a Data Protection Officer and undertake a Data Protection Impact Assessment.
Setting aside the inherent problems that follow (such as the fact that ‘Data Auditor’ is not defined anywhere in the Act), this presents a competition concern for consumers.
Last year, the Committee on Digital Competition Law submitted a draft of the Digital Competition Act detailing how companies that provide digital services should compete. In it, the Committee put forth criteria to recognise significantly powerful companies with huge consumer and business bases as Systemically Significant Digital Enterprises (SSDEs). This is done to make sure these companies act fairly, transparently, and in a non-discriminatory manner.
The factors relied upon to determine whether a company is an SSDE include the volume of transactions it handles, the advantage it derives from the volume of data it collects, its total revenue, and the total number of people using its services (more than 1 Crore consumers and 10,000 businesses), among others.
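As an illustrative sketch only (it uses just the user-base figures mentioned above and omits the draft’s financial criteria; the function is hypothetical), the SSDE test is essentially a threshold check:

```python
CRORE = 10_000_000  # 1 crore = 10 million

def meets_user_thresholds(end_users: int, business_users: int) -> bool:
    """True if an enterprise crosses the user-base thresholds mentioned above:
    more than 1 crore consumers and 10,000 business users."""
    return end_users > 1 * CRORE and business_users > 10_000

print(meets_user_thresholds(end_users=25_000_000, business_users=40_000))  # True
```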
If the proposed Digital Competition Act becomes law and is implemented after the DPDPA, there is a predicament that cannot be ignored.
A company with significant business operations and a significant consumer base that collects massive amounts of data could be considered an SSDE under the Digital Competition Act, yet it may not be recognised as a Significant Data Fiduciary under the DPDPA.
Because of that, even though it deals with a large volume of data, it does not need to appoint a Data Auditor or a Data Protection Officer, nor is it required to undertake a Data Protection Impact Assessment.
This is even more complicated because an SSDE is recognised by the CCI, whereas an SDF is only identified by the Central Government.
It’s a rare sight for a digital company that collects data to be significant in terms of its operations and NOT to be significant in terms of the data it collects…
Conclusion:
While the Digital Personal Data Protection Act (DPDPA) marks a significant step towards establishing a data privacy framework in India, it falls short of being an ideal law, particularly concerning its impact on competition and consumer welfare. Numerous loopholes and ambiguities within the Act raise concerns about its effectiveness in protecting individual privacy and ensuring fair market practices.
The lack of clarity on key concepts like ‘notice,’ ‘consent,’ and ‘necessity’ leaves room for exploitation by Data Fiduciaries, potentially leading to excessive data collection and anti-competitive behaviour. The DPDPA’s provisions on children’s data and consent withdrawal raise further concerns about potential harm and jurisdictional conflicts with the Competition Commission of India. Moreover, the undefined role of the Data Auditor and the potential mismatch between Significant Data Fiduciaries under the DPDPA and Systemically Significant Digital Enterprises under the proposed Digital Competition Act create additional uncertainties.
To address these shortcomings, the DPDPA requires further refinement and clarification. A more robust framework that balances the interests of businesses and consumers is essential. This could involve establishing clearer guidelines for data collection, processing, and storage, as well as strengthening enforcement mechanisms to ensure compliance. Moreover, greater collaboration between the Data Protection Board and the Competition Commission of India is crucial to address the complex interplay between data privacy and competition in the digital age.
Ultimately, the success of the DPDPA will depend on its ability to adapt to the evolving digital landscape and effectively safeguard the privacy rights of individuals while fostering a healthy and competitive market environment. It is imperative that policymakers, regulators, and businesses work together to address the concerns raised in this analysis and ensure that the DPDPA truly serves the interests of all stakeholders.