|Title:||'Who Watches the Watchmen?' An Empirical Analysis of Errors in DMCA Takedown Notices|
|Citation:||Seng, D. (2015). 'Who Watches the Watchmen?' An Empirical Analysis of Errors in DMCA Takedown Notices.|
|Key Related Studies:|
|Linked by:||Fiala and Husovec (2018), Seng (2015b)|
|About the Data|
|Data Description:||Two main datasets form the basis of this study:|
|Data Type:||Primary and Secondary data|
|Secondary Data Sources:|
|Data Collection Methods:|
|Data Analysis Methods:|
|Cross Country Study?:||No|
|Government or policy study?:||No|
|Time Period(s) of Collection:||
Under the Digital Millennium Copyright Act (DMCA) takedown system, to request the takedown of infringing content, content providers and agents issuing takedown notices are required to identify the infringed work and the infringing material, and to attest to the accuracy of that information and to their authority to act on behalf of the copyright owner. Online service providers are required to evaluate such notices for effectiveness and compliance before acting on them. Yet Google and Twitter, as service providers, claim very different successful takedown rates. There is also anecdotal evidence that many of these successful takedowns are "abusive", in that they do not contain legitimate complaints of copyright infringement or erroneously target legitimate content sites. This paper seeks to answer these questions by systematically examining the issue of errors in takedown notices. By parsing each individual notice in a dataset of half a million takedown notices and more than fifty million takedown requests served on Google up to 2012, this paper identifies the various types of errors made by content providers and their agents when issuing takedown notices, as well as the notices that Google erroneously responded to. The paper finds that up to 8.4% of all successfully-processed requests in the dataset had "technical" errors, and that, in addition, at least 1.4% of all successfully-processed requests had "substantive" errors. As all these errors are avoidable at little or no cost, this paper proposes changes to the DMCA that would improve the takedown system.
By strengthening the attestation requirements for notices, subjecting notice senders to penalties for submitting notices with unambiguous substantive errors, and clarifying the responsibilities of service providers in responding to non-compliant notices, the takedown system can remain a fast, efficient and nuanced system that balances the diverse interests of content providers, service providers and the Internet community at large.
Main Results of the Study
The main findings of this study are:
* After parsing each notice in the dataset of half a million takedown notices and more than fifty million takedown requests served on Google up to 2012, this paper finds that almost all notices comply with the non-functional formalities.
* However, 8.3% of all takedown notices in 2012 fail to comply with the functional formalities. In addition, at least 1.3% of the takedown requests exhibit "substantive" errors that misidentify the copyright owner or provide inactive URIs as takedown requests.
* To ensure that the takedown system remains fast, efficient and error-free, this paper proposes to strengthen the attestation requirements of notices, to require reporters to validate all submitted takedown requests, and to subject recalcitrant reporters to the "slow lane" of a two-tier system for processing takedown notices.
Policy Implications as Stated By Author
The study makes three explicit policy recommendations:
* "The first proposal is to make a slight change to the existing language of the DMCA to require a reporter issuing the takedown notice, under penalty of perjury, to attest to the accuracy of the information in the notice and its good faith belief of its claims of copyright infringement. Currently, the DMCA only calls for the reporter's statements of accuracy and good faith belief without making them attestations under penalty of perjury."
* "The second proposal is to provide a mechanism to require reporters to submit verified takedown requests. This will go some way to address not just the problem identified by the Megaupload test, but also alter the existing practice that simply assumes a reliable and accurate identification of the infringing content without accessing or downloading the infringing content itself. Content owners have always complained about the "whack-a-mole" problem wherein a disabled link to allegedly infringing content reappears in a new link."
* "The third proposal calls for a mechanism to place a "cost" – a binding disincentive – on reporters for submitting bad or erroneous takedown requests. In the rush to stem the tide of piracy and in the absence of penalties for making "false positive" takedown requests, reporters have tended to file takedown requests which a judge has described in one case as "overzealous and overreaching"."
Coverage of Study
|Level of aggregation:||Takedown Requests|
|Period of material under study:||2008-2012|
|Level of aggregation:||Takedown Notices|
|Period of material under study:||2008-2012|