Threads, Meta's planned Twitter rival, isn't publicly available yet, but it already looks like a privacy nightmare.
According to the mandatory privacy disclosures required by iOS, the app may collect highly sensitive information to profile users' digital activity, including health and financial data, precise location, browsing history, contacts, search history and more.
This is hardly surprising given that Meta, the company behind the app (formerly known as Facebook), makes its money by tracking and profiling web users in order to sell their attention via behavioral advertising and microtargeting. It does, however, raise questions over whether Threads will be able to launch in the EU, since the legal basis Meta had claimed for processing Facebook users' personal data (performance of a contract) was rejected at the start of this year.
Since then, Meta has switched to claiming a legitimate interest in processing this data for ads. But earlier this week the European Union's top court added to Meta's regional woes with a ruling on a German case referral, finding that this legal basis is not suitable for running Meta's behavioral ads either and that consent must be obtained. Under existing EU law, the General Data Protection Regulation, sensitive data such as health information requires an even higher bar of affirmative consent to be processed lawfully. So Meta would need to ask for, and obtain, explicit consent before processing sensitive data like health information.
Additionally, incoming EU rules (see: the Digital Services Act and Digital Markets Act) may require tech giants to obtain specific consent before they can combine data for ad profiling, or may prohibit the use of sensitive data for ads outright. So there is yet more regional legal uncertainty in the works for Meta's people-farming enterprise. So-called very large online platforms must comply with the DSA's requirements by August 25, while designated gatekeepers must be in compliance with the DMA by next spring.
The adtech behemoth currently doesn't even offer users a basic, upfront option to refuse its tracking and profiling, let alone ask directly whether it can share data on your health conditions so advertisers can try to sell you diet pills or whatever. And with even stricter restrictions on surveillance ads on the way in the EU, it's hard to see regional authorities waving through an app that promises to track everything in order to boost its appeal to advertisers.
On top of that, Meta was recently ordered to stop sending EU users' data to the US for processing and fined around $1.3 billion for violating the GDPR's rules on data exports. That order is specific to Facebook, but in theory the same obligation could apply to other Meta services that fail to adequately protect Europeans' data as it crosses the Atlantic (for example, by using zero-knowledge, end-to-end encrypted architecture). And it's clear Threads won't offer users that level of privacy.
Given that Threads serves up more of the same data-grabbing attention farming that has earned Mark Zuckerberg's empire such a toxic reputation it had to undergo a costly corporate rebrand to Meta in recent years, bringing Meta's surveillance ads business into compliance with EU law looks like it will require a sea change in how it operates.
It's questionable whether that rebranding has succeeded in cleaning up Meta's corporate image, given that it chose to associate Threads with Instagram's brand rather than explicitly calling it a Meta app (the developer listed on the App Store is "Instagram Inc.," and the text description refers to the app as "Instagram's text-based conversation app"). Then again, Meta may have made that choice mainly to nudge Instagram's sizable and active user base into quickly adopting what it is positioning as a sister "text" app, so the latter can get off the ground.
The Irish DPC, Meta's lead data protection supervisor in the region, told the Irish Independent yesterday that it had been in contact with Meta about the service and that it will not launch in the EU "at this point."
According to a report in today's Guardian, which cited sources inside Meta, the company delayed an EU launch of Threads because of legal uncertainty over how user data can be used, linked to the aforementioned DMA's restrictions on combining user data across different platforms.
A spokesperson for the company did not respond when we asked whether Meta intends to launch Threads in the EU.
So it appears there has been no active regulatory intervention to block a launch at this point. Rather, Meta seems concerned about the legal risk it would take on by launching now, given it will be subject to the DMA in a few months' time. (Earlier this week the company notified the EU that it believes the incoming ex ante antitrust regime does apply to its business, although compliance isn't required until six months after the official EU gatekeeper designations are made.)
Rather than being enforced locally by Member State agencies like the Irish DPC, the new legislation will be enforced centrally by the European Commission, and the expectation is that this will mean more consistently muscular enforcement against tech giants like Meta.
Notably, Threads is slated to launch in the U.K. on Thursday, where the regulatory picture differs since the market is no longer subject to EU law following Brexit.
Technically, the U.K.'s current data protection regime retains the GDPR's legal requirements around processing personal data. But the ICO, the country's data privacy watchdog, has done little to tackle widespread violations in the surveillance advertising sector, so Meta may feel comfortable with the level of legal risk its business faces in post-Brexit Britain. And while the U.K. government has revived a plan to pass its own ex ante antitrust reform targeted at digital giants, it will likely be years before it has legislation comparable to the EU's DMA on the books.
In a post-Brexit data reform bill, the U.K. government has also hinted at a plan to weaken domestic data protection laws. This move could further erode the ICO’s independence and render the watchdog even less effective than it already is at stopping data protection violations.
Meanwhile, Meta was fined over $410 million in the EU in January for running behavioral ads on Facebook and Instagram without a valid legal basis under the GDPR, just the latest in a string of major penalties it has racked up for breaching the regulation. By contrast, the last time the ICO sanctioned Meta, the company was still called Facebook, and the penalty came in the wake of the Cambridge Analytica scandal.
The theoretical maximum penalty data protection authorities can impose on data controllers for GDPR violations is just 4% of global annual turnover; under the DMA, centrally enforced sanctions can scale up to 10% of global annual revenue.
In practice, fines imposed on tech firms found to have violated the EU's data protection regulation have remained a small fraction of that maximum, particularly in Meta's case.