Intermediary Liability
Daphne Keller, Center for Internet and Society
February 2018
CC BY 4.0 licensed
Intermediary Liability Law
§ Defines intermediaries’ legal responsibilities for content posted by users.
§ Often “notice and takedown,” but sometimes complete immunity, as with US CDA 230.
§ Different from voluntary “Terms of Service” or Community Guidelines removals.
§ In some countries, governed by specific statutes.
§ In some, governed by general law of tort, copyright, crime, etc.
Intermediary Liability Law
§ Obviously important to tech companies.
  § Protects them from financial losses, including bankruptcy-level losses under US copyright law.
  § Incentivizes investment in open platforms (VCs are reluctant to fund in the face of uncertain IL laws).
  § The smaller the platform, the more legal immunities matter (Google and Facebook can hire armies of moderators; their next competitors can’t).
§ Why should anyone else care?
Policy Goals
§ Harm prevention
  § IL laws can reduce harms ranging from movie piracy to child pornography. Platforms’ gatekeeper role makes them powerful enforcers.
§ Economic growth and innovation
  § Express purpose of many IL laws. At the extreme, bad IL laws would make platform businesses impossible.
§ Free expression
  § Platforms that fear liability take down legal speech to be safe. Investors that fear liability don’t fund open platforms.
Legal Dials and Knobs
§ Protect broader or narrower set of technologies.
§ Permit greater or lesser degree of platform engagement with content (e.g., does a platform lose immunity for algorithmic recommendations or sorting?).
§ Define culpable “knowledge” broadly or narrowly.
§ Provide more or less detailed procedural rules for notice and takedown (e.g., can a notice that does not list the URL/location of allegedly unlawful content create liability?).
Notice and Takedown in Real Life
§ Economic incentive to over-remove.
§ Multiple studies document over-removal. See Urban et al. 2016 and my CIS blog post listing other studies.
  § Censorship goals: Ecuador, Retraction Watch
  § Commercial goals: takedowns targeting competitors are very, very common
§ Why so many empirical studies about the DMCA? A note on national law and platform transparency.
§ “Wrongful removal” claims against intermediaries are close to impossible in the US. I’ve never seen a claim succeed elsewhere, but that may change.
Cases on IL and Free Expression
§ Belen Rodriguez v. Google, Argentina 2014
  § No monitoring obligation, because of threat to speech
  § Public adjudication/order standard for many removals
§ Shreya Singhal, India 2015
  § Public adjudication/order standard for all removals
§ European Ct of Human Rights cases (47 states)
  § Delfi, 2015: OK to make news forum monitor for hate speech
  § MTE, 2016: NOT OK to make news forum monitor for defamation / personality rights infringement
§ U.S. cases from the 1990s
  § Zeran, Cubby, CDT v. Pappert
Human Rights Sources
§ UN and regional Free Expression Rapporteur materials
  § Organization of American States Rapporteurs really led the way on this.
  § 2013 (Botero) and 2016 (Lanza) reports emphasized procedural safeguards, need for judicial review, problems with monitoring requirements.
§ Council of Europe materials
§ New scholarship on:
  § Possible state obligations (a) not to strongarm platforms to remove legal speech, (b) not to tolerate shoddy removal practices.
  § Private platform obligations.
Intermediary Liability, Free Expression, and Procedural Safeguards

“Amateurs talk tactics. Generals talk logistics.”
- Facebook spokesman describing notice and takedown issues, quoting his commander in the Marine Corps.
Procedural Safeguards
§ OSP removes content based on knowledge that it is unlawful
  § Per formal notice process (DMCA)
  § “Red Flag” knowledge (DMCA)
  § Sometimes court order standard (Brazil, Chile, ~Spain)
§ Opportunity for accused speaker to provide “counter-notice” defending the speech.
  § Mexico versus Spain on RTBF and publisher notice
§ Penalties for groundless, bad-faith removal requests.
§ Gold standard: Manila Principles
Limits of Counternotice and Appeals
§ Used very little in practice
§ Some cannot be used without conceding legal arguments, including jurisdiction (DMCA)
§ Not effective for anonymous speech
§ Not effective as protection for the right to access information, as opposed to speech rights
§ Counter-argument: If an individual won’t stand up for her speech, why should laws or platforms protect it?
US Intermediary Liability Law
§ DMCA: highly formal notice and takedown for copyright
  § Where the DMCA does not apply, plaintiffs may still be unable to establish copyright secondary liability
§ CDA 230: complete immunity for leaving most content up, “good Samaritan” immunity for taking it down
§ Gaps between the two: intellectual property, federal crimes, and a few oddball claims
§ Statutes occupy the field. That means US courts mostly haven’t looked at the First Amendment/free expression issues in two decades, while courts in other countries have.
Communications Decency Act 230 (47 U.S.C. § 230)
§ Extremely broad immunity
§ Express economic goal (to promote Internet development) and free expression goal
§ Pragmatic calculation:
  § Platforms that monitor for offensive speech risk being considered editors and held liable. (This happened in the US pre-CDA 230 and happens in Europe now.)
  § Congress thought platforms would do more to “clean up the Internet” if they had a broad immunity.
§ Frequent legislative attacks, including SESTA now
Boundaries of CDA Immunity
§ Platform can’t develop content, in whole or in part.
§ Roommates case: “The CDA does not grant immunity for inducing third parties to express illegal preferences. Roommate’s own acts—posting the questionnaire and requiring answers to it—are entirely its doing and thus section 230 of the CDA does not apply to them.”
Digital Millennium Copyright Act (DMCA), 17 U.S.C. § 512
§ The DMCA is: a strong source of immunity for copyright damages claims. If an intermediary qualifies for DMCA protection, and follows the removals process for complaints, it is well protected.
§ The DMCA is not: a law obliging an intermediary to remove content, or indeed to do anything at all.
  § Corollary: If an intermediary does not have copyright liability anyway, then it has no legal reason to comply with DMCA requests.
  § That said, two layers of defense are better than one.
Using the DMCA Safe Harbors
1. Service must be listed in 512(a)-(d).
2. Provider must satisfy some simple, logistical prerequisites.
3. Most providers must properly respond to valid DMCA notices.
4. Provider must not run afoul of some known, and fiercely litigated, DMCA hot issues.
1. DMCA-Eligible Services
There are safe harbors for 4 kinds of services:
§ 512(a) Transitory communications
§ 512(b) System caching
§ 512(c) Hosting at the direction of a user
§ 512(d) Indexing
Cases are overwhelmingly about (c).
2. Logistical Prerequisites for DMCA
§ Have and communicate to users a repeat infringer policy.
  § Sounds simple, but platforms mess it up all the time.
§ Register an agent with the USCO.
  § New rule, not in the statute: renew every 3 years.
§ Provide contact information for removal requests.
3. Responding to Notices
§ Notice must provide specified info from 512(c)(3).
  § Inadequate notice can trigger a duty to coach.
  § Inadequate notice cannot create knowledge. (c)(3)(B)(i)
§ Intermediary has no duty to monitor. (m)
§ Intermediary must remove “expeditiously.”
§ Counternotice: host “takes reasonable steps promptly to notify” the accused, who may counternotice as specified in 512(g). (The sequence is sketched below.)
  § Host “promptly” notifies complainant of the counternotice.
  § 10-14 days after receipt of the counternotice, host reinstates content unless complainant notifies the host that it is suing the accused.
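Read as a workflow, the notice, counternotice, and reinstatement steps above form a small state machine with statutory deadlines. Below is a minimal illustrative sketch, not legal advice and not any platform's actual system; the class and method names (DMCANotice, TakedownCase, maybe_reinstate) are invented for this example, and the 10-14 business-day window is simplified to calendar days.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional


@dataclass
class DMCANotice:
    """Rough stand-in for the elements § 512(c)(3) requires in a notice."""
    complainant: str
    work_identified: bool        # identification of the copyrighted work
    location_identified: bool    # URL or similar location of the material
    good_faith_statement: bool
    signature: bool

    def is_valid(self) -> bool:
        # An inadequate notice cannot create "knowledge" ((c)(3)(B)(i));
        # the host may instead have a duty to help the sender fix it.
        return all([self.work_identified, self.location_identified,
                    self.good_faith_statement, self.signature])


@dataclass
class TakedownCase:
    notice: DMCANotice
    content_removed: bool = False
    counternotice_date: Optional[date] = None
    suit_filed: bool = False

    def process_notice(self) -> str:
        if not self.notice.is_valid():
            return "coach the sender; no removal duty triggered"
        self.content_removed = True          # remove "expeditiously"
        return "removed; accused user notified promptly"

    def process_counternotice(self, received: date) -> None:
        # Host forwards the counternotice to the complainant promptly.
        self.counternotice_date = received

    def maybe_reinstate(self, today: date) -> bool:
        """Reinstate 10-14 days after counternotice unless a suit is filed."""
        if self.counternotice_date is None or self.suit_filed:
            return False
        if today >= self.counternotice_date + timedelta(days=10):
            self.content_removed = False
            return True
        return False
```

A real system would also track the designated agent, repeat-infringer history, and business-day counting; the point here is only that each step in 512(c)(3) and 512(g) maps onto a concrete, auditable state transition.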
4. Hot Issues
§ “Red Flag” issue: Host must not know of infringement or be “aware of facts or circumstances from which infringing activity is apparent.” 512(c)(1)(A)(ii).
§ Host must not “receive a financial benefit directly attributable to the infringing activity, in a case in which the service provider has the right and ability to control such activity.” 512(c)(1)(B).
§ Host’s activities must not exceed the scope of the 512(c) safe harbor.
§ “Take down, stay down” demands vs. 512(m).
Outside the US
§ The most comprehensive system is the EU eCommerce Directive.
  § Source of major political pressure now.
§ Many countries have law for just one aspect, e.g. just for copyright (Chile) or just for ISPs (Japan).
§ Many have no specific law but, as in Belen Rodriguez in Argentina, reason from general tort and rights principles.
§ Some rely on control of local architecture.
  § China’s “Great Firewall”
  § Turkey or Russia ordering national ISPs to block sites or apps
European Union eCommerce Directive
§ A Directive is roughly like a treaty, implemented in national law.
§ Overlap with other Directives on topics like intellectual property.
§ Substantially about economic harmonization (contract formation, regulated professions online, etc.).
§ Our focus is on the intermediary liability provisions.
eCommerce Directive
§ Art 12: Safe harbor for “mere conduit” information transmission.
§ Art 13: Safe harbor for caching; must be “automatic, intermediate, and temporary.”
§ Art 14: Safe harbor for hosting; provider must remove upon obtaining knowledge of illegality.
§ Art 15: “No general obligation to monitor.”
eCommerce Directive vs. DMCA
§ Covers all claims,* not just copyright
§ No process specified at the Directive level, rarely specified in national law
§ No indexing safe harbor
§ Article 15 “no general obligation to monitor” in tension with provisions allowing national courts to issue injunctions to PREVENT infringement

* Except maybe data protection.
eCommerce Hot Issues
§ When does a host lose the safe harbor for being too “active”? Analogous to DMCA “right and ability to control,” but EU case law is far worse for platforms. (Draft Audio Visual Media Services Directive)
§ Compulsory “voluntary” commitments? (Hate Speech Code of Conduct)
§ Can EU courts compel global deletion/delisting? (Google and Facebook cases)
§ What are the real limits on compelling intermediaries to monitor user expression? (Communication, draft Copyright Directive, Facebook CJEU case)
Pressure to Monitor or Filter User Content
§ Terrorism concerns
  § Strong statements from May, Macron, G7, G20…
  § OSPs built a shared hash database for filtering terrorist content (see the sketch below)
  § September 2017 Communication from the Commission
§ Copyright concerns
  § EU draft Directive
  § US DMCA 512 Notice of Inquiry
§ Improved filtering technology
  § But still error-prone, can’t identify context
  § Very expensive ($60 million for YouTube’s Content ID)
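Shared hash databases of the kind mentioned above work by matching fingerprints of files already judged violating, which is why they scale cheaply but cannot judge context (news reporting, research, counter-speech). The sketch below is a minimal, assumed illustration of the idea using plain SHA-256 rather than the perceptual hashing real systems rely on; the function names and sample data are invented for this example.

```python
import hashlib


def fingerprint(data: bytes) -> str:
    """Exact fingerprint of a file; real filters use perceptual hashes
    that also match re-encoded or slightly altered copies."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical database of fingerprints for items previously removed.
BLOCKED_SAMPLE = b"bytes of a previously removed file"
KNOWN_HASHES = {fingerprint(BLOCKED_SAMPLE)}


def should_flag(upload: bytes) -> bool:
    """Flag uploads whose fingerprint matches a known item.

    Note the limit: the lookup cannot tell whether the upload is the
    original clip or a news report quoting it. Context is invisible to a
    hash match, which is the error-proneness noted on the slide above.
    """
    return fingerprint(upload) in KNOWN_HASHES


if __name__ == "__main__":
    print(should_flag(BLOCKED_SAMPLE))         # True: exact copy is caught
    print(should_flag(BLOCKED_SAMPLE + b" "))  # False: any change defeats an exact hash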
Limits on Monitoring Requirements?
§ eCommerce Directive Article 15
  § CJEU L’Oreal case: some concrete parameters on what “general” means in Art. 15
§ Human rights under the EU Charter or European Convention
  § CJEU SABAM cases: monitoring burdens users’ rights to privacy and free expression, and the host’s right to conduct business
  § ECtHR Delfi and MTE rulings: Making a news site monitor user comments for defamation violates users’ information/expression rights; making it monitor for hate speech does not.
Current Developments
§ Digital Single Market inquiry
  § Decision to leave the eCommerce Directive intact
    § No change to safe harbors
    § No change to Art 15 monitoring rules
§ Proposed legislation
  § Copyright Directive Art 13
  § Audio Visual Media Services Directive
  § Communication on Tackling Illegal Content
    § Non-binding strong statement that OSPs should monitor
Commission Sept. 2017 Communication
Hosts should:
§ “Proactively detect, identify, and remove” illegal content.
§ Notify law enforcement.
§ Automated removal without human review is fine, “e.g. in cases of material whose removal is notified by law enforcement authorities” (not courts, but police).
This is OK because:
§ The counter-notice process can correct errors. (But see all known data on counter-notice.)
§ Proactive hosts are not too “active” for Art. 14 immunity.
§ Monitoring for specific content is not “general” under Art. 15.
Thank You
https://cyberlaw.stanford.edu/about/people/daphne-keller
@daphnehk