In creating the “safe harbors” that protect Internet and online service providers from copyright liability for what their users do, the Digital Millennium Copyright Act spelled out the procedure they must follow for removing infringing content when requested by a copyright owner. Almost from the start, however, that procedure has been a source of friction between the various parties.
Copyright owners must monitor a huge number of platforms for potentially infringing content and send individual takedown notices for each instance of infringement. Sending out those thousands of notices requires substantial staff resources, and in their view it often takes too long for the content to be removed. Service providers, for their part, must dedicate resources to responding to the notices, as well as to notifying the user who actually posted the content and responding to possible counter-notices. Users often complain that content is taken down that should not have been.
But peace may be breaking out. At the NewTeeVee Video Rights Roundtable Wednesday, representatives from the major studios and platform developers acknowledged a shared interest in automating the process to provide quicker response times and reduce the amount of resources required. Better yet, they said, would be a system that avoided the need to send takedown notices at all, by identifying content before it’s uploaded or as it’s streamed and applying predetermined business rules.
“We want options,” Fox’s content protection counsel Betsy Zedek said. “I have an entire team that does nothing but send takedown notices, but it would be better to have more direct involvement in revenue generating parts of the company.”
Automating the process, however, could create its own set of problems. One of the unresolved issues surrounding notice-and-takedown has been its potential to unnecessarily limit the fair use of copyrighted content. Many postings on user-generated sites that incorporate copyrighted content, for instance, could arguably qualify as fair use and therefore would not be infringing. The notice-and-takedown process, however, tends to encourage a shoot-first-ask-questions-later approach to possible fair-use content. While both content owners and platform providers have taken steps to minimize the fair-use impact of takedowns — such as employing teams of lawyers to review individual posts before asking to have them removed — automating the process to make it faster could squeeze the breathing space needed to make a fair determination of fair use.
“When we’re out there looking for content [to take down] we’re really looking for full-form content,” Warner Bros.’ Ethan Applen objected when I posed the question at the roundtable. “We have thousands of clips that get uploaded [by users], from Gossip Girl, Vampire Diaries, One Tree Hill, and we mostly let those stay up. Those people are fans of the show.”