In pretty much any other context I would probably have ignored this, but this is the highest court in the EU, which instills respect in me (although it is of course seriously ridiculous that, in times of a pandemic, they do not live stream all of their proceedings).

<note> The notes below are a cleaned-up version of the notes I took from the press room during the hearing on the 10th of November in Luxembourg. I consider them pretty accurate, but they are definitely not good enough to be quoted in anything serious. I have the least confidence in the parts dealing with the interventions by Spain and the Council (which was represented by a Spanish-speaking lawyer). In both cases the interpreter struggled to keep up with the speed of the spoken interventions, and as a result some of the meaning likely got lost or mixed up. <end of note>

Notes:

Nothing really surprising happened; all parties behaved more or less as expected. I would say that the whole hearing was more about the Commission guidance than about the Polish request to annul Article 17(4)b+c: 75% of the time was spent discussing questions related to the guidance.

Republic of Poland

The Polish government argued its case, which essentially rests on the observation that, while not explicitly mandating them, Articles 17(4)b+c effectively require platforms to implement upload filters because there are no other means to comply. PL argues that the Commission guidance merely "waters down" the directive without fixing its fundamental flaws.

PL also notes that the issue at stake has implications well beyond the EU. It notes that the US Copyright Office concluded in its 501c hearing that the US should wait to see if the DSM directive has negative consequences before adopting similar measures. PL concludes that the EU must not become a testing ground for limiting freedom of expression online.

European Parliament

The European Parliament clearly did not want to be dragged into this fight. Its basic argument was that whether Art 17 limits fundamental rights depends on specific implementations, and since there are no implementations yet, it cannot say anything meaningful. Also, there is a proportionality requirement, so everything will be just fine!

The EP also noted that it expects a lot of technological innovation, so it is good that the directive is worded in a neutral and abstract way. The EP thinks that the Commission's draft guidance is not a valid reference; it wants to limit itself to discussing the provisions of the directive.

EP: Art 17(4) is inextricably linked to 17(7). The mutual interplay of the best-efforts obligation in (4) and the obligation of result in (7) has been "designed so that one fundamental right may not be used to limit the other". This will ensure that there will be no interference with fundamental rights because of Art 17.

EP: concerns that the application of Art 17 will limit fundamental rights are "merely hypothetical in nature". Mistakes will be made, but the fact that there are human errors in application does not mean that the directive is problematic.

Council

Council: It is important to recollect that Art 17 is about OCSSPs (everybody called them platforms in the hearing) and that they are different from ISPs. Platforms arrange content to make it attractive so they can make a profit (while ISPs are like the post office).

Council: PL is getting things wrong. Art 17 has nothing to do with prescribing filters. 17(4) simply sets out a number of conditions for platforms to avoid liability. The key paragraph is 17(1), which makes platforms responsible vis-à-vis rightholders for communication to the public. This restores order on the internet.

Council: in this context "platforms may feel compelled to use filters". But they are doing this already for internal analytics, to block content they don't like (nudity), to block illegal content (CSAM), and to manage copyright (ContentID). Art 17 does not prevent users from sharing content; it just creates liability for platforms if they do. Art 17 does not require the use of specific technologies to avoid liability; it merely sets an objective. It is up to platforms to find a specific implementation.

Council in response to Q1 (are there technologies other than upload filters that can effectively implement 17(4)b?): Currently this can be done by a combination of automated content recognition, AI, and human review. Systems can be good but will be imperfect. They will be circumvented, but platforms will not be liable for circumvention.

Example: if a user manages to circumvent an audio filter by uploading a slowed down work and then uses platform provided tools to speed the track up again, the platform will not be liable. But the platform will be expected to fix this loophole and will be liable if it does not.

Council in response to Q3 (does Art 17 allow a system in which only manifestly infringing content is automatically blocked and all other content needs to stay online until it has been determined to be infringing?): The "manifestly infringing" standard is not contained in the directive. Also, copyright is complicated, and uses of very small parts of a work can be infringing (points to the Pelham case).

Council: Art 17(4)b only requires blocking if rightholders have identified works to be blocked and have ascertained that the use of the work is infringing. 17(7) imposes a clear and strict obligation on providers not to remove non-infringing content. This is an obligation of result, and as such it is stronger than the best-efforts obligation in 17(4). Of course there can be really difficult cases (this is nothing new in copyright), and platforms can make the wrong call. That is what the complaint and redress mechanism in 17(9) is for.