Notice and take down


Notice and take down is a process operated by online hosts in response to court orders or allegations that content is illegal. Content is removed by the host following notice. Notice and take down is widely operated in relation to copyright infringement, as well as for libel and other illegal content. In United States and European Union law, notice and takedown is mandated as part of limited liability, or safe harbour, provisions for online hosts. As a condition for limited liability online hosts must expeditiously remove or disable access to content they host when they are notified of the alleged illegality.

United States

The Online Copyright Infringement Liability Limitation Act, passed into law in 1998 as part of the Digital Millennium Copyright Act, provides safe harbour protection to "online service providers" for "online storage" in section 512. Section 512 applies to online service providers that store copyright-infringing material. In addition to the two general requirements that online service providers comply with standard technical measures and remove repeat infringers, section 512 grants safe harbour only to providers that: 1) do not receive a financial benefit directly attributable to the infringing activity, 2) are not aware of the presence of infringing material or of any facts or circumstances that would make infringing material apparent, and 3) upon receiving notice from copyright owners or their agents, act expeditiously to remove the allegedly copyright infringing material.
An online service provider can be notified through the copyright owner's written notification of claimed infringement. Section 512 lists a number of requirements the notification must comply with, including a physical or electronic signature of the copyright owner or an authorised agent, identification of the copyrighted work claimed to be infringed, identification of the allegedly infringing material together with information sufficient to locate it, the complainant's contact information, a statement of good-faith belief that the use is not authorised, and a statement, made under penalty of perjury, that the information in the notification is accurate.
Provided the notification complies with the requirements of Section 512, the online service provider must expeditiously remove or disable access to the allegedly infringing material; otherwise it loses its safe harbour and is exposed to possible liability. The provider must then take reasonable steps to promptly notify the alleged infringer of the removal. If the alleged infringer responds with a counter notification, the provider must promptly notify the claiming party of the objection; unless the copyright owner then brings a lawsuit in district court within 14 days, the provider is required to restore the material to its location on its network. A counter notification must in turn comply with requirements set out in Section 512, including the subscriber's physical or electronic signature, identification of the removed material and its location before removal, a statement under penalty of perjury that the subscriber has a good-faith belief the material was removed by mistake or misidentification, and the subscriber's name, address, and consent to the jurisdiction of a federal district court.
If a court determines that the copyright owner misrepresented the claim of copyright infringement, the copyright owner becomes liable for any damages that resulted to the online service provider from the improper removal of the material. The online service provider is also required to appropriately respond to "repeat infringers", including termination of online accounts. On this basis online service providers may insert clauses into user service agreements which allow them to terminate or disable user accounts following repeat infringement of copyright. Some online service providers identify a "repeat infringer" through repeated notice and takedown requests, while others require a determination by a court.
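The notice, removal, counter-notice, and restoration steps described above can be sketched as a simple state machine. This is purely illustrative: the field names and the simplified state model below are hypothetical and are not drawn from the statutory text or from any real hosting platform.

```python
from dataclasses import dataclass

# Elements a section 512 notification must contain (simplified labels).
REQUIRED_NOTICE_FIELDS = {
    "signature",             # physical or electronic signature of the owner or agent
    "work_identified",       # identification of the copyrighted work
    "material_location",     # where the allegedly infringing material is hosted
    "contact_information",   # complainant's address, phone number, email
    "good_faith_statement",  # good-faith belief the use is not authorised
    "accuracy_statement",    # accuracy statement under penalty of perjury
}

@dataclass
class HostedItem:
    url: str
    state: str = "available"      # available -> removed -> restored
    counter_noticed: bool = False

def notice_is_compliant(notice: dict) -> bool:
    """A notice lacking required elements does not trigger the takedown duty."""
    return REQUIRED_NOTICE_FIELDS <= notice.keys()

def handle_notice(item: HostedItem, notice: dict) -> str:
    if not notice_is_compliant(notice):
        return "notice rejected: missing required elements"
    item.state = "removed"        # expeditious removal preserves the safe harbour
    return "material removed; uploader promptly notified"

def handle_counter_notice(item: HostedItem) -> str:
    item.counter_noticed = True   # claimant must now be promptly notified
    return "claimant notified of objection"

def after_waiting_period(item: HostedItem, lawsuit_filed: bool) -> str:
    """Called once the statutory waiting period after a counter notice has run."""
    if item.counter_noticed and not lawsuit_filed:
        item.state = "restored"
        return "material restored"
    return "material stays down"
```

For example, a compliant notice moves an item from "available" to "removed"; a counter notice followed by no lawsuit moves it to "restored".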

European Union

The basis for notice and takedown procedures under EU law is article 14 of the Electronic Commerce Directive, adopted in 2000. Article 14 applies to content hosts in relation to all "illegal activity or information". Online hosts are not liable for illegal activity or information placed on their systems by users, so long as they do not have "actual knowledge" of the activity or information. Upon obtaining such knowledge, the online host must act expeditiously to remove or to disable access to the information. The Directive does not set out notice and takedown procedures, but it envisaged the development of such processes, because online hosts that fail to act expeditiously upon notification lose their limited liability protection. The Directive suggests that voluntary agreements between trade bodies and consumer associations could specify notice and takedown processes, and that such initiatives should be encouraged by member states.
In most EU countries at the national level, there are no explicit rules regarding notice of infringement, take-down process or counter notice and put back. Where explicit rules do not exist, some aspects of notice requirements can be derived from common principles of law. By nature, this lack of explicit rules results in a lack of clarity and legal certainty when compared to legal regimes with statutory rules.
In October 2013, the European Court of Human Rights ruled in the Delfi AS v. Estonia case that the Estonian news website Delfi was liable for defamatory comments left by users under one of its articles. The court stated that the company "should have expected offensive posts, and exercised an extra degree of caution so as to avoid being held liable for damage to an individual's reputation" and that its notice and take down comments moderation system was "insufficient for preventing harm being caused to third parties".

India

In India, takedown of online content can be ordered under Section 69A of the Information Technology Act, 2000.

Criticism

Notice and takedown has been criticised for over-blocking, that is, the take down of non-infringing content. In 2001 the Electronic Frontier Foundation launched a collaborative clearinghouse for notice and takedown requests, known as Chilling Effects. Researchers have used the clearinghouse to study the use of cease-and-desist demands, primarily looking at DMCA 512 takedown notices, but also at non-DMCA copyright issues and trademark claims. A 2005 study of the DMCA notice and take down process by Jennifer Urban and Laura Quilter of the Samuelson Law, Technology and Public Policy Clinic concluded that "some notices are sent in order to accomplish the paradigmatic goal of 512 – the inexpensive takedown of clearly infringing hosted content or links to infringing web sites". However, on the basis of data on such notices the study concluded that the DMCA notice and take down process "is commonly used for other purposes: to create leverage in a competitive marketplace, to protect rights not given by copyright, and to stifle criticism, commentary and fair use". It would be misleading, though, to conclude that these problems do not arise under the E-Commerce Directive merely because it provides no statutory notice and take-down procedure: such chilling effects are a problem of provider liability as such.
In 2007 numerous US-based online service providers hosting user-generated content implemented content recognition technology to screen uploaded content for possible copyright infringement. These content ID systems, such as the one operated by YouTube, fall outside the notice and takedown process mandated by the Digital Millennium Copyright Act. The Electronic Frontier Foundation, along with other civil society organisations, published principles on user-generated content calling for the protection of legitimate uses of copyright-protected works, prior notification of the uploader before content is removed or ads are placed on it, and use of the DMCA counter notice system, including reinstatement of content upon counter notice where the copyright owner fails to bring a lawsuit.
The Electronic Commerce Directive, unlike the Digital Millennium Copyright Act, did not define so-called notice and action procedures under article 14 of the Directive. Member states implemented diverging approaches on the duty to act expeditiously and on when an online host obtains "actual knowledge" of a notification. Inconsistent approaches also developed across the EU as to whether online service providers, such as search engines or social media networks, fall within the article 14 definition of an online host. As a result, notice and takedown procedures are fragmented across EU member states and online hosts face considerable legal uncertainty. The European Commission consulted on notice and action procedures under article 14 in 2010, and launched a new initiative in June 2012. The European Commission observed that "Online intermediaries face high compliance costs and legal uncertainty because they typically have operations across Europe, but the basic rules of Article 14 are interpreted in different ways by different national courts." As part of the initiative the European Commission intends to clarify which online service providers fall within the article 14 definition of online hosts. The initiative also assesses whether different categories of illegal content require different notice and action approaches.
By 2013 the European Commission's notice and action initiative appeared to have come to a halt. The reasons are unclear. One may be a wish to avoid bad publicity, since notice and take down is associated with the chilling effects on free speech described above. Another may be the following problem: the Commission has made clear that it does not want to reopen the Electronic Commerce Directive, yet it seems impossible to provide legal certainty in the take down process without a binding legal underpinning.

Notice and stay down

The term notice and stay down is used to refer to the concept of additionally requiring that a service, after it has received a request to take down a certain copyrighted work, must also prevent the same work from becoming available on the service again in the future. Proposals for such concepts typically prescribe the implementation of automatic content recognition, similar to YouTube's "Content ID" system, that would proactively filter identified works and prevent them from being re-uploaded. Proposals for notice and stay down rules have been made in the United States by pro-copyright lobbyists, and constitute Article 17 of the EU's Directive on Copyright in the Digital Single Market.
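The stay-down idea can be sketched as an upload filter keyed on fingerprints of previously taken-down works. The sketch below is a deliberate simplification: real systems such as YouTube's Content ID use robust perceptual fingerprints that match re-encoded or altered copies, whereas the exact SHA-256 hash used here would miss even trivially modified re-uploads. The class and method names are hypothetical.

```python
import hashlib

class StayDownFilter:
    """Refuses re-uploads of works that were previously taken down."""

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # Stand-in for a perceptual fingerprint: an exact content hash.
        return hashlib.sha256(content).hexdigest()

    def register_takedown(self, content: bytes) -> None:
        # After a takedown, remember the work so future uploads are screened.
        self._blocked.add(self.fingerprint(content))

    def allow_upload(self, content: bytes) -> bool:
        # Reject any upload matching a previously taken-down work.
        return self.fingerprint(content) not in self._blocked
```

Once a work is registered, every subsequent upload is checked against the blocklist before it becomes available, which is what distinguishes "stay down" from a one-off "take down".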
The concept of notice and stay down has faced criticism; it has been noted that the only way to reliably enforce such an obligation would be through automatic filtering, which is subject to false positives and is unable to detect lawful uses of an affected work. The Electronic Frontier Foundation argued that requiring proactive monitoring of user content would place the burden of copyright enforcement on service providers, and would be too costly for newly established companies.