Digital freedom advocacy group the Electronic Frontier Foundation (EFF) has come out against the adoption of a "take down, stay down" system, warning that it will lead to a "filter-everything" approach.
The U.S. Government's Copyright Office is currently seeking public comment on how it can change existing digital copyright laws to satisfy today's needs, and some rights-holders have called for a "take down, stay down" approach to copyright enforcement.
Under the current DMCA rules, it is up to rights-holders to provide URLs to service providers such as Google for removal. The same piece of content, such as a unique music track, often has multiple URLs, sometimes thousands of them, and under the current approach rights-holders have to submit each one for service providers to remove.
While rights-holders can submit multiple URLs per DMCA takedown request, and despite Google removing more than 1,500 URLs per minute, new URLs usually pop up faster than they can be taken down. It is because of this that rights-holders have called for a new approach.
Under the "take down, stay down" approach, rights-holders would only have to identify the unique piece of work (Justin Bieber's 'Love Yourself', for example), and it would then be up to service providers to search and destroy all URLs, new and old, pointing to that content.
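In rough terms, such a system amounts to a fingerprint database kept by the provider: a work is registered once, and every upload or indexed URL is checked against it from then on. The sketch below is purely illustrative and not a description of any deployed system; real matchers (such as YouTube's Content ID) use perceptual audio/video fingerprints, so the cryptographic hash here is only a stand-in.

```python
import hashlib

# Hypothetical "stay down" filter. A rights-holder registers a work once;
# the provider then blocks any upload whose fingerprint matches.
# SHA-256 is a stand-in for the perceptual fingerprints real systems use.

registered_works = set()

def register_work(content: bytes) -> str:
    """Rights-holder identifies a work once; the provider stores its fingerprint."""
    fingerprint = hashlib.sha256(content).hexdigest()
    registered_works.add(fingerprint)
    return fingerprint

def should_block(upload: bytes) -> bool:
    """Every new upload is checked against all registered fingerprints."""
    return hashlib.sha256(upload).hexdigest() in registered_works

register_work(b"love-yourself-master-recording")
print(should_block(b"love-yourself-master-recording"))  # True: re-upload blocked
print(should_block(b"unrelated home video"))            # False
```

The key shift this models is economic: under the current regime the rights-holder does the searching; under "stay down" the provider runs the check on every single upload, forever.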
With good reason, service providers are wary of this approach, as they predict this will make them ultimately responsible for policing content, and at their own expense. The EFF agrees with this prediction.
"Filter-everything would effectively shift the burden of policing copyright infringement to the platforms themselves, undermining the purpose of the safe harbor in the first place," the EFF's Elliot Harmon warns.
If service providers are tasked with "take down, stay down", automated scanning tools will be a necessity to deal with the millions of pieces of content, and even more URLs, that need to be processed. This, the EFF says, is also a red flag, as automated tools, or "copyright bots", are notorious for being inaccurate.
“Here’s something else to consider about copyright bots: they’re not very good. Content ID routinely flags videos as infringement that don’t copy from another work at all. Bots also don’t understand the complexities of fair use. In September, a federal appeals court confirmed that copyright holders must consider fair use before sending a takedown notice. Under the filter-everything approach, legitimate uses of works wouldn’t get the reasonable consideration they deserve. Even if content-recognizing technology were airtight, computers would still not be able to consider a work’s fair use status,” Harmon adds.
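The fair-use problem Harmon describes is structural, not a matter of better matching. A hypothetical toy matcher, sketched below, makes the point: a bot that flags any upload sharing a fragment with a protected work will also flag a commentary that merely quotes a short excerpt, a use a court might well find fair, and no amount of fingerprint comparison can tell the two apart.

```python
# Illustrative only: why matching alone cannot judge fair use.
# The bot flags any upload that shares a byte-chunk with a protected work,
# so a review quoting a short excerpt is flagged exactly like a full copy.

CHUNK = 8  # match granularity in bytes; purely illustrative

def chunks(data: bytes) -> set:
    """All overlapping CHUNK-byte fragments of the data."""
    return {data[i:i + CHUNK] for i in range(0, len(data) - CHUNK + 1)}

def bot_flags(upload: bytes, work: bytes) -> bool:
    """Flag the upload if it shares any fragment with the protected work."""
    return bool(chunks(upload) & chunks(work))

song = b"never say never, I will fight till forever"
review = b"critique: the line 'fight till forever' is trite"
print(bot_flags(review, song))  # True: the quotation alone triggers the bot
```

Whether the quotation is infringement or protected commentary is a legal judgment about purpose and context, information that simply is not present in the bytes the bot compares.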
The biggest worry, according to Harmon, is how such a system might be abused by rights-holders to silence critics en masse.
"You don’t need to look far to find examples of copyright holders abusing the system, silencing speech with dubious copyright claims," says Harmon.