The EU and Australia draw up sweeping rules to curb illegal online content. And the ECJ drops a whopper.


The European Court of Justice: there is no such thing as an innocent “like”. Website operators are liable for data protection issues. More at the end of the post.

Eric De Grasse
Chief Technology Officer

 

30 July 2019 (Brussels, Belgium) – Who said the EU Institutions are on summer break?

The EU is working on a regulation (the Digital Services Act) governing how platforms must remove harmful content. It follows the UK government's "Online Harms" white paper published earlier this year (click here), the UK's attempt to write a "code of practice" for tech companies to control the spread of illegal or harmful content on the internet and to stop the undermining of civil discourse.

The pundits think this issue will spread. It will be driven by different ideas of what free speech means outside the presumptions made in the U.S. (no one outside America cares what's in the American constitution) and by a different organizational model of regulation (mostly outcome-based rather than the U.S. rules-based model), and there could be "lowest-common-denominator" effects: "local" laws will have global consequences.

By default, of course, all of this raises the cost base of running a UGC ("User Generated Content") network (which may, for example, mean you need to get to revenue quicker and grow slower), and it entrenches the incumbents, which is in direct conflict with what other regulators in the same building might be working on.

I have a link below to a good summary from the Financial Times (it is behind the FT paywall so I uploaded it to our SlideShare account so you all have access), but here are some highlights from a briefing memo that my partner, Greg Bufithis, wrote for our digital media clients:

The Digital Services Act in a nutshell

The Regulation aims to provide “appropriate incentives to promote fairness and transparency” in order to maintain healthy competition in the ranking of corporate website users by online search engines and across the wider online platform economy.

The Regulation acknowledges the importance of search engines and online platforms in the commercial success of businesses, especially small and medium sized enterprises (Recital 2). It also acknowledges that often these businesses will be commercially dependent upon search engines and online platforms. On this, the Regulation stresses that:

Given that increasing dependence, the providers of those services often have superior bargaining power, which enables them to, in effect, behave unilaterally in a way that can be unfair and that can be harmful to the legitimate interests of their business users and, indirectly, also of consumers in the EU. For instance, they might unilaterally impose on business users practices which grossly deviate from good commercial conduct, or are contrary to good faith and fair dealing. This Regulation addresses such potential frictions in the online platform economy.

To this end, the Regulation introduces “measures” and “safeguards” to foster fairness and transparency on the platform economy by requiring that contractual terms and conditions of search engines and online platforms be clear, transparent and drafted in good faith.

The targets of the Regulation are search engines and “online intermediation services”. While we have a pretty good idea what a search engine is, what exactly is an “online intermediation service”?

To be classified as an "online intermediation service" (and be subject to the provisions of this Regulation), a platform must meet three cumulative requirements. Thus, the platform or service must, according to Article 2(2):

(1) be classified as an "information society service" within the meaning of EU law (see Directive (EU) 2015/1535, Article 1(1)(b));

(2) allow business users to offer goods or services to consumers, with a view to facilitating transactions between those business users and consumers (irrespective of where those transactions are ultimately concluded);

(3) provide services to business users under a contract.
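For illustration only (this is a sketch, not legal analysis), the cumulative nature of the Article 2(2) test can be modeled as a simple predicate in which all three limbs must hold; the property names below are my own shorthand, not terms from the Regulation:

```javascript
// Hypothetical model of the Article 2(2) cumulative test.
// All property names are illustrative shorthand, not the Regulation's wording.
function isOnlineIntermediationService(platform) {
  return (
    platform.isInformationSocietyService && // (1) Directive (EU) 2015/1535, Art. 1(1)(b)
    platform.facilitatesB2CTransactions && // (2) lets business users offer goods/services to consumers
    platform.contractsWithBusinessUsers    // (3) provides services to business users under a contract
  );
}

// A marketplace-style platform meets all three limbs.
const marketplace = {
  isInformationSocietyService: true,
  facilitatesB2CTransactions: true,
  contractsWithBusinessUsers: true,
};

// A plain news site fails limbs (2) and (3), so it falls outside the definition.
const newsSite = {
  isInformationSocietyService: true,
  facilitatesB2CTransactions: false,
  contractsWithBusinessUsers: false,
};

console.log(isOnlineIntermediationService(marketplace)); // true
console.log(isOnlineIntermediationService(newsSite));    // false
```

The point of the sketch: because the requirements are cumulative, failing any single limb takes a platform outside the Regulation's scope.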

The Regulation casts a broad net over the platform economy. The definition covers a wide range of platforms, from Amazon to Uber, Airbnb and Fiverr, for example.

Here is a link to the FT article that nicely summarizes the issues: (click here)

Meanwhile, Australia is also conducting a competition review of internet platform companies (click here).

Meanwhile, over at the European Court of Justice …

 

On Monday, the Court of Justice of the European Union (ECJ), Europe’s highest court, ruled that websites and other online publishers are liable for data protection issues that arise when they embed third-party plug-ins like Facebook’s “Like” button onto their pages.

The case needs some further analysis, but even a cursory reading shows it is likely to create more privacy headaches for websites, many of which include such plug-ins that collect data on visitors, data which is then shared with tech companies like Google, Twitter and Facebook.

In brief, the case was brought after Fashion ID, a German online retailer, included Facebook’s “Like” button on its site, which sent users’ IP addresses and other personal information to Facebook Ireland, the company’s international headquarters. Verbraucherzentrale, a German consumer protection group, filed a lawsuit to stop Fashion ID from sharing this information over fears that such automatic data collection broke the region’s previous data protection rules.

NOTE: these standards were subsequently upgraded under the General Data Protection Regulation (GDPR) in May 2018.

The Court opinion says:

“Fashion ID can be considered to be a controller jointly with Facebook Ireland in respect of the operations involving the collection and disclosure by transmission to Facebook Ireland of the data at issue”.

The judges added that websites like Fashion ID must clearly explain to their users how such information is collected through third-party plug-ins and what the data will be used for. These publishers also must obtain consent from individuals before the information is collected and transferred to companies like Facebook.
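In practice, publishers often respond to this requirement by gating third-party plug-ins behind their own consent banner, so that nothing is fetched from the plug-in provider until the user agrees. A minimal sketch of that pattern follows; the `socialPlugins` consent flag is a hypothetical name set by the site's own banner, and the URL is simply Facebook's public JavaScript SDK endpoint:

```javascript
// Illustrative consent-gating sketch (an assumed pattern, not an official
// Facebook or GDPR-mandated API): decide whether to load the plug-in SDK
// at all, so no visitor data flows to the provider before consent.
function pluginScriptToLoad(consent) {
  // `consent.socialPlugins` is a hypothetical flag recorded by the site's
  // own consent banner; only an explicit `true` permits loading.
  if (consent && consent.socialPlugins === true) {
    return 'https://connect.facebook.net/en_US/sdk.js';
  }
  return null; // no consent: render a local placeholder button, load nothing
}

console.log(pluginScriptToLoad({ socialPlugins: true }));  // the SDK URL
console.log(pluginScriptToLoad({ socialPlugins: false })); // null
console.log(pluginScriptToLoad(null));                     // null
```

The design choice matters for the ruling: if the script is never loaded pre-consent, no IP address or other personal data reaches the plug-in provider, which is exactly the collection-and-transmission phase for which the Court holds the publisher jointly responsible.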

By stating that publishers are so-called "co-controllers" alongside the makers of these digital plug-ins, the Court made them equally liable for any data protection issues that may arise once people's personal information is collected through such online tools.

But the judges also said that such liability should be limited to the data collected through these plug-ins, and not to any other privacy issues that arise if tech companies like Facebook then use this data for other purposes:

The operator is not, in principle, a controller in respect of the subsequent processing of those data carried out by Facebook alone.
