Perel and Elkin-Koren (2017)



Source Details

Title: Black Box Tinkering: Beyond Disclosure in Algorithmic Enforcement
Author(s): Perel, M., Elkin-Koren, N.
Year: 2017
Citation: Perel, M. and Elkin-Koren, N. (2017) Black Box Tinkering: Beyond Disclosure in Algorithmic Enforcement. 69 Fla. L. Rev. 181.
Link(s): Definitive, Open Access
Key Related Studies:
Discipline:
Linked by:
About the Data
Data Description: The researchers set out to gather data on the behavior of Online Service Providers over the lifecycle of a typical uploaded work of user-generated content: from filtering at the moment of upload, to receipt and handling of takedown notices, to the removal of content and notification of the uploader. To accomplish this, the research team uploaded and tracked purpose-made clips in matched pairs: for example, one clip contained non-infringing footage set to copyrighted music, while its paired control contained only the footage. The survival of each clip was then monitored.

The researchers obtained ethical approval for the study from their university’s ethics committee in Israel, and notified the platforms at the conclusion of the study that they had conducted a social science experiment.
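The survival-monitoring step described above lends itself to simple automation. The following is a minimal sketch, in Python, of how such tracking might be implemented; the study does not describe its actual tooling, so the URLs, the polling schedule, and the use of HTTP status codes as a liveness signal are all assumptions for illustration.

```python
# Hypothetical sketch: log the survival of paired test clips over time.
# The URLs, polling interval, and the assumption that an HTTP 200 response
# means "clip still available" are illustrative, not taken from the study.
import csv
import time
from datetime import datetime, timezone

import requests

# Each pair: a treatment clip (footage plus copyrighted music) and a
# matched control (the same footage without the music).
CLIP_PAIRS = {
    "pair_01": {
        "treatment": "https://example-videohost.test/clips/abc123",
        "control": "https://example-videohost.test/clips/abc124",
    },
}


def is_live(url: str) -> bool:
    """Treat HTTP 200 as 'still available'; anything else as removed."""
    try:
        return requests.head(url, allow_redirects=True, timeout=10).status_code == 200
    except requests.RequestException:
        return False


def record_survival(log_path: str = "survival_log.csv") -> None:
    """Append one timestamped survival observation per clip to a CSV log."""
    now = datetime.now(timezone.utc).isoformat()
    with open(log_path, "a", newline="") as f:
        writer = csv.writer(f)
        for pair, clips in CLIP_PAIRS.items():
            for role, url in clips.items():
                writer.writerow([now, pair, role, url, is_live(url)])


if __name__ == "__main__":
    # Check once a day for a week; diverging survival between treatment and
    # control clips indicates enforcement keyed to the copyrighted audio.
    for _ in range(7):
        record_survival()
        time.sleep(24 * 60 * 60)
```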

Data Type: Primary data
Secondary Data Sources:
Data Collection Methods:
Data Analysis Methods:
Industry(ies):
Country(ies):
Cross Country Study?: No
Comparative Study?: No
Literature review?: No
Government or policy study?: No
Time Period(s) of Collection:
  • 2013
Funder(s):
  • I-CORE Program of the Planning and Budgeting Committee
  • The Israel Science Foundation

Abstract

“The pervasive growth of algorithmic enforcement magnifies current debates regarding the virtues of transparency. Using codes to conduct robust online enforcement not only amplifies the settled problem of magnitude, or “too-much-information,” often associated with present-day disclosures, but it also imposes practical difficulties on relying on transparency as an adequate check for algorithmic enforcement. Algorithms are non-transparent by nature; their decision-making criteria are concealed behind a veil of code that we cannot easily read and comprehend. Additionally, these algorithms are dynamic in their ability to evolve according to different data patterns. This further makes them unpredictable. Moreover, algorithms that enforce online activity are mostly implemented by private, profit-maximizing entities, operating under minimal transparency obligations. As a result, generating proper accountability through traditional, passive observation of publicly available disclosures becomes impossible. Alternative means must therefore be ready to allow the public a meaningful and active interaction with the hidden algorithms that regulate its behavior.

This Essay explores the virtues of “black box” tinkering as means of generating accountability in algorithmic systems of online enforcement. Given the far-reaching implications of algorithmic enforcement of online content for public discourse and fundamental rights, this Essay advocates active public engagement in checking the practices of automatic enforcement systems. Using the test case of algorithmic online enforcement of copyright law, this Essay demonstrates the inadequacy of transparency in generating public oversight. This Essay further establishes the benefits of black box tinkering as a proactive methodology that encourages social activism. Finally, this Essay evaluates the possible legal implications of this methodology and proposes means to address them.”

Main Results of the Study

The researchers found that 25% of the video sharing websites and 10% of the image sharing websites tested in Israel made use of some ex ante filtering technology at the point of upload. 50% of the video sharing websites removed infringing content upon receipt of a takedown notice, while only 12.5% of the image sharing websites did so. After removing content, all of the video sharing websites tested notified the uploader about the removal, while only 11% of the image sharing websites did so. The wide variation in practices between online platforms suggests problems with procedural justice in the Israeli setting observed by the researchers. The researchers also noted the presence of false positives (the removal of non-infringing content upon receipt of a takedown notice) as evidence of the failure of both human and algorithmic systems for handling notice-and-takedown procedures.

Policy Implications as Stated By Author

Whilst the authors do not offer explicit policy recommendations, they advocate that the public undertake “black box tinkering” as a method of uncovering the hidden functionality of algorithms and holding them accountable. In the context of the intermediary liability regime, this means conducting experiments on live platforms under conditions controlled by the researcher to test how algorithms like YouTube’s Content ID system react to various inputs.
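To make the method concrete, the following is a minimal, hypothetical harness for this kind of black box probe, written in Python. It assumes nothing about any real platform’s API: the upload_clip callable is a stub the experimenter would replace with per-platform upload code, and the probe names simply mirror the paired-clip design described under About the Data.

```python
# Hypothetical sketch of a "black box tinkering" harness: feed controlled
# inputs to an opaque enforcement system and log its observable responses.
# No real platform API is assumed; upload_clip is a stub the experimenter
# must replace with per-platform upload code.
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class Probe:
    name: str        # e.g. "footage_only" vs. "footage_plus_copyrighted_music"
    clip_path: str   # local file used as the controlled input


@dataclass
class Observation:
    probe: str
    accepted_at_upload: bool                      # False suggests ex ante filtering
    removed_after_notice: Optional[bool] = None   # filled in at a later stage


def run_probes(probes: List[Probe],
               upload_clip: Callable[[str], bool]) -> List[Observation]:
    """Upload each controlled clip and record whether the platform accepts it."""
    return [
        Observation(probe=p.name, accepted_at_upload=upload_clip(p.clip_path))
        for p in probes
    ]


if __name__ == "__main__":
    probes = [
        Probe("footage_only", "clips/control.mp4"),
        Probe("footage_plus_copyrighted_music", "clips/treatment.mp4"),
    ]

    def fake_upload(path: str) -> bool:
        # Stand-in that "accepts" every clip; replace with real upload logic
        # when running an actual experiment.
        return True

    for obs in run_probes(probes, fake_upload):
        print(obs)
```

Comparing the observations for the treatment and control probes, per platform and per stage (upload, notice, removal), is what reveals the enforcement behaviour the algorithm never discloses.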


Coverage of Study

Coverage of Fundamental Issues (✓ = issue covered by the study)
  • Relationship between protection (subject matter/term/scope) and supply/economic development/growth/welfare ✓
  • Relationship between creative process and protection - what motivates creators (e.g. attribution; control; remuneration; time allocation)?
  • Harmony of interest assumption between authors and publishers (creators and producers/investors)
  • Effects of protection on industry structure (e.g. oligopolies; competition; economics of superstars; business models; technology adoption) ✓
  • Understanding consumption/use (e.g. determinants of unlawful behaviour; user-generated content; social media)

Coverage of Evidence Based Policies (✓ = issue covered by the study)
  • Nature and Scope of exclusive rights (hyperlinking/browsing; reproduction right)
  • Exceptions (distinguish innovation and public policy purposes; open-ended/closed list; commercial/non-commercial distinction)
  • Mass digitisation/orphan works (non-use; extended collective licensing)
  • Licensing and Business models (collecting societies; meta data; exchanges/hubs; windowing; crossborder availability)
  • Fair remuneration (levies; copyright contracts)
  • Enforcement (quantifying infringement; criminal sanctions; intermediary liability; graduated response; litigation and court data; commercial/non-commercial distinction; education and awareness) ✓
