
Statement of Best Practices towards Takedown Transparency

Jointly prepared by Working Groups 3 and 4 (Supergroup Red), as part of the Research Sprint on Takedowns and Transparency: Global Norms, Regulation and the Nature of Online Information, hosted by the Berkman Klein Center for Internet & Society at Harvard University.

Team members/Authors:

  • Berdien Bernarda Erika van der Donk (Working Group 3)
  • Charles Culioli (Working Group 3)
  • Eren Sözüer (Working Group 4)
  • Inika Serah Charles (Working Group 4)
  • Snigdha Bhatta (Working Group 4)
  • Tavishi Ahluwalia (Working Group 3)
  • Torsha Sarkar (Working Group 3)

Table of Contents

Statement of Best Practices towards Takedown Transparency

Table of Contents

List of Abbreviations

Introduction

Foundational Principles of the SOBP

Applicability

Methodology

STATEMENT OF BEST PRACTICES

General Best Practices

Phase-wise Best Practices

PHASE 1. Transparency on TDRs: Request visibility

PHASE 2. Transparency on OSP takedown procedure: Decision-making visibility

PHASE 3. Transparency post-takedown decisions: Decision visibility

Operational Best Practices

INTERNAL REFLECTIONS ON THE CREATION OF AN SOBP

Reflection on the groups’ process

1.1. Reflection on working group 3’s project

1.2. Reflection on working group 4’s project

Brainstorming

Forming the SOBP

The Tripartite Model: General, Phase-wise, and Operational Best Practices

Lessons Learned

Unresolved Questions and Looking Ahead

Post-mortem improvements: what’s next?

Annex: Glossary of terms

List of Abbreviations

  1. OSP: Online Service Provider
  2. SOBP: Statement of Best Practices
  3. TDR: Takedown Request
  4. RTBF: Right to be Forgotten
  5. CSAM: Child Sexual Abuse Material
  6. NCII: Non-consensual Intimate Imagery
  7. DMCA: Digital Millennium Copyright Act

Introduction

The following document presents a draft statement of transparency best practices (SOBP) that OSPs are recommended to adopt when responding to and reporting on TDRs. Our SOBP aims to initiate a discussion on transparency in all phases of a takedown.

The decision to focus on defining these stages of takedown has been a deliberate one. Numerous statements of best practices exist, the latest being the Council of Europe’s recommendation on the impacts of digital technologies on freedom of expression, published in early April 2022. When discussing how exactly we as a group could contribute to these existing guidelines in the limited amount of time allocated to this sprint, we quickly realized that the short time frame of this research sprint would not allow us to produce a statement equivalent to the Council’s, nor would producing a similar product be beneficial.

The first phase of the SOBP concerns pre-decision transparency. It covers the period prior to the takedown decision and focuses on the extent of TDRs and the prerequisites for the justifiability of a takedown decision. The second phase relates to the takedown decision. This stage covers the review of TDRs by OSPs and focuses on the procedural and organizational requirements for making the takedown decision. The third and last phase deals with post-decision transparency. This stage covers the period after the takedown decision and focuses on the transparency requirements that allow checks-and-balances.

In addition, transparency in content takedown by OSPs is addressed by deliberating on a few preliminary considerations:

1. Whether to make the decision itself transparent, or the process of producing that decision?

2. How best to address retrospective transparency?

3. Whether to create transparency best practices for OSPs with or without the involvement of specific receptors?

While the SOBP recognizes that sunlight may not always be the best disinfectant and is conscious of the risks of over-transparency, it aims to provide insight into both the process of takedown and the events that follow that process, to reinforce the idea that transparency is an indispensable value in all stages of the takedown process.

We recognised that there are certain best practices with regard to transparency that would be applicable across all phases of the takedown process, which we have segregated into General and Operational Best Practices.

Transparency goals specific to different stakeholders and at different stages of the takedown are critical, and we recognized that it is equally important to provide a set of General Best Practices that most OSPs, if not all, can buy into. Such best practices do not necessarily depend on the takedown notice itself, i.e. they are phase-agnostic in nature, and would include non-derogable measures such as preparing reports that segregate the types of TDRs, making room for country-specific reporting, etc.

More importantly, we realized the importance of identifying loopholes or lacunae in the takedown system, especially if they have been prone to abuse in the past and so, found it necessary for OSPs to include measures to tackle such instances of abuse. Therefore, this SOBP also addresses another fundamental aspect of transparency: accessibility. To that extent, the SOBP includes a separate section on practical or Operational Best Practices, wherein aspects related to readability, user-interface, and visual adaptability are included.

We also recognised that OSPs function at varying scales, where large market players have the capacity to adhere to increased transparency obligations which may be onerous for smaller players. Through the SOBP, we mark out the transparency obligations which could be considered to be made applicable to only ‘Large Scale OSPs’. Similarly, there are instances where reporting on granular and sensitive data would be more appropriate to selected researchers, and not to the larger group of users. We have marked out these suggested segregations in transparency requirements as well.

Importantly, we are cognizant of the fact that there are a number of limitations of such an SOBP to be adopted by OSPs. These include:

  • This SOBP is intended to serve as a baseline guide for OSPs to foster increased transparency in their handling of TDRs.
  • This SOBP is not legally binding on OSPs, and would serve as voluntary guidelines.
  • There may be conflicting legal obligations (such as, under local laws) which would prevent the OSPs from fully implementing this SOBP.
  • In some instances, the existence of overlapping and conflicting rights would prevent full compliance with the SOBP. Such rights include the right to freedom of expression and the right to privacy and data protection.
  • The members of this research sprint would have liked to further interrogate the use of automated tools in the decision-making process for handling TDRs, and strived to address this issue to the best of our capability, but were limited by a lack of technical knowledge.

Foundational Principles of the SOBP

Working on the premise that platforms moderating online content are no different than judicial authorities, it was necessary to first form the questions for which we sought answers from platforms. (a) What are the standards on the basis of which takedown decisions are taken? (b) Who is making the decisions? (c) Are requests being met with a fair and proportionate response and in line with due process norms? (d) What type of content is reviewed by humans and what type by automated tools? (e) Do takedowns maintain balance between competing interests, e.g., privacy and scientific research? As some of these predominant questions have previously been addressed in the Santa Clara Principles On Transparency and Accountability in Content Moderation and the New America Transparency Reporting Toolkit, the attempt was to build on foundational principles that were relied on by those institutions.

Along with universally accepted instruments such as the Universal Declaration of Human Rights and the United Nations Guiding Principles on Business and Human Rights, instruments with industry buy-in such as the GNI Principles on Freedom of Expression and Privacy, and mechanisms for algorithmic accountability, such as those of the Algorithmic Accountability Agency Framework (A3 Framework), further attention might need to be paid to the following principles:

  1. Due Process: The overarching foundational principle is that of Due Process, and one that should permeate into each stage of the content takedown process.
  2. Predictability: OSPs should aim to adopt rules that create a sense of predictability for users. Creating a predictable atmosphere would entail clearly defining each and every act that is considered a violation by the platform.
  3. Explainability: OSPs should publish clear, accessible and readable transparency reports at frequent time intervals. Explainability would also entail that these reports be made available in the language(s) of each country, so that takedown and appeals decisions reflect the language, culture, political and social context, as well as the legal framework of the country in which content is removed.
  4. Stakeholder-specificity: Transparency should be made context-specific, e.g. a maximum-transparency approach to government requests and a minimum-transparency approach to content involving CSAM, NCII, etc.
  5. Accessibility: Not only should takedown notices be brought within closer public view but to ensure actual accountability, users and relevant stakeholders should have access to details of globally applicable or jurisdiction-specific local laws relevant to the specific content, details of any formal or informal working relationships and/or agreements the company has with state actors when it comes to flagging content or accounts or any other action taken by the company, and other granular information that are relevant to assess accountability.
  6. Contestability: A robust grievance redressal mechanism should be in place to allow alleged infringers to file counter notices. A fixed time frame must be set within which OSPs should respond to such users, with a recourse to further appeal.

In the pre-decision phase:

The pre-decision phase covers the period before an OSP takes a takedown decision and focuses on the information on permissible content and behavior that is available to the users of OSP services. Before a takedown decision is taken, users must be transparently informed of the rules applicable to the platform. The terms of service of any OSP service should clearly state the types of content that will be removed and the types of behavior that can lead to (permanent) exclusion from the platform.

In the takedown phase:

The takedown phase deals with the internal mechanisms through which OSPs process TDRs and arrive at decisions. It spans the time between the receipt of a TDR and the OSP's ultimate action. Important components of this phase include the composition and constitution of internal institutions and organizational hierarchies, resource allocation, internal guidelines on dealing with different categories of TDRs, internal deliberations on decisions, the extent of use and efficacy of automated tools, the efficacy and training of human reviewers, etc. Consequently, meaningful transparency in this phase should allow decision-making processes to be assessed against fairness and non-discrimination, consistency, and predictability. Similarly, internal institutions should be transparent to enable representation, especially of historically marginalized communities.

In the post-decision phase:

The post-decision phase covers the period after the takedown decision and focuses on the transparency requirements that allow checks-and-balances. At this stage of the takedown process, respecting due process means granting users a right to appeal. A balance of interests has to be struck between the right to privacy and transparency around the right to appeal. Transparency also ensures that due process is respected. As a result, transparency in this phase should include redress mechanisms, such as appeals both by OSPs and by other stakeholders, as additional checks on takedowns.


Applicability

The best practices apply whether OSPs take down content proactively or reactively, as long as the takedown process is required by external actors or triggered by external rules.

Unless otherwise indicated, the best practices apply to all OSPs, irrespective of size or type. Best practices marked out with a (🔺) only apply to Large Scale OSPs, as defined in the glossary. OSPs that do not qualify as such are strongly encouraged to comply with these obligations.

Best practices marked out with a (●) pertain to transparency afforded to researchers, as defined under the best practice, “Researcher Access.”

Methodology

PART I. GENERAL BEST PRACTICES

These best practices are applicable across all phases or tiers: they aim to answer questions regarding what is removed, the norms applicable to removals, and how instances of abuse of the takedown mechanism are tackled.

PART II. PHASE-WISE BEST PRACTICES

These best practices are specific to the phase in the takedown process: (1) Transparency on the TDRs themselves; (2) Transparency on the decision-making procedure on how the TDR is handled; and (3) Post-hoc transparency, or transparency on practices after the decision on the TDR has been made by the OSP.

PART III. OPERATIONAL BEST PRACTICES

The focus of these best practices is to ensure that the transparency report is readable, periodic, and drafted in a manner that sets out the data in the most user-friendly format possible, so that the object of scrutiny can be seen by the relevant parties and stakeholders can observe the conduct of those that govern.

Legend

🔺  Applicable for Large Scale OSPs

●  Part of Researcher Access


STATEMENT OF BEST PRACTICES

I. General Best Practices

  1. Providing information on norms/rules applicable to TDRs

OSPs should clearly display their rules on content moderation on their website. The rules should specify the OSPs’ takedown practices under applicable law and under internal policy.

  2. Separate reporting for different categories of TDRs

Segregating reports, or providing separate sections to address the different types of TDRs received and processed, is a good practice that allows the data released by OSPs to be better appreciated. Different types of TDRs, such as copyright, government requests, and RTBF requests, involve different considerations. The provision of segregated information would be of great help to users and researchers alike.

  3. Jurisdiction-specific reporting

Keeping in mind that takedown processes and numbers may differ widely across jurisdictions, OSPs should provide jurisdiction-specific data. This would not only demonstrate the number of countries from which TDRs are received, but also bring to light other insights, such as which countries' governments are more active in seeking the removal of content from the internet. This could include:

  • A list of countries from which TDRs are received.
  • The number of TDRs received from each country.

  4. Countering misuse of TDR procedures 🔺

There are instances in which takedown mechanisms are misused by users to have content removed. Disclosures in this regard, as part of transparency reporting, could be helpful in devising solutions to counter such misuse of the takedown procedure. Such information should include:

  • Information on measures taken to prevent or counter abuse of TDRs in general, as well as measures that relate to specific takedown procedures. Such measures should be supported by sufficiently detailed explanations of representative cases.
    • Good practice: In its copyright delistings transparency report, Google indicates that it rejects TDRs which, upon investigation, are found to be deceptive.[1]
  • Data on the internal systems in place to identify bot accounts that send automated TDRs.

  5. Providing information on OSPs’ policies regarding specific requesters 🔺

For certain categories of TDRs, OSPs may have a separate operating procedure and policy for particular requesters and reporting agencies. Policies around trusted flaggers are one such example.

OSPs that have such procedures in place ought to publish them for increased transparency. This would include:

  • The requesters for whom such special policies are in place, along with examples of such trusted flaggers and reporting agencies.
  • The procedures in place for TDRs sent by all such requesters.

II. Phase-wise Best Practices

PHASE 1. Transparency on TDRs: Request visibility

  1. Segregating and defining the categories of TDRs

  • OSPs should publish aggregate data on all TDRs, segregated based on the category of request. For example, requests pertaining to copyright, the RTBF, legal demands, and local laws should all be reported on separately.
  • Categories of requests should be as refined as possible, depending on the type of service and product. For instance, the RTBF and defamation requests both pertain to local laws, but should be reported on separately. In such cases, data should be provided for both the broader category and the sub-categories.
  • The following categories, at a minimum, are recommended: copyright, trademark, counterfeit, legal demands, local laws, RTBF, defamation, CSAM, NCII, emergency requests.
    • Good practice: Twitter’s divisions, such as between “legal demands” and “local laws”[2]
  • OSPs should define and explain the scope of each category, referring to applicable laws or policies, and provide representative examples of requests.
  • OSPs should report on the normative basis of TDRs (e.g. DMCA, GDPR, German Network Enforcement Law), both in the aggregate and in relation to each TDR type. For Large Scale OSPs, this should include the total number of TDRs per law along with sufficiently detailed and intelligible explanations of the law. 🔺●
  • In addition to categorizing requests based on type, large-scale OSPs should report on the issues that TDRs relate to, e.g. drug abuse, terrorist content, national security, both in the aggregate and in relation to each TDR type. This should include the total number of TDRs per issue along with sufficiently detailed and intelligible explanations of the scope of each issue. 🔺

  2. Reporting on requester profile

OSPs should provide data on requester profiles.

  • TDRs submitted by individuals:
    • As a rule, only high-level, aggregate data should be published, and individuals sending the TDRs should not be identified. Only the percentage and total number of requests by individuals should be reported.
    • Where applicable, and depending on the type of request, individual requests may be broken down into categories. This would depend on the status of the individual, for instance, in the case of a public figure and an RTBF request. 🔺
    • If the requester is a public figure or the request pertains to an issue of public interest, the identity of the requester should be reported, except where such reporting is outweighed by other interests (e.g. privacy or limitations imposed by local law). 🔺
  • TDRs submitted by non-individuals:
    • As a rule, the names of non-individual requesters, e.g. state/government, NGO, trusted flagger, reporting agency, and the corresponding percentage and number of submitted TDRs should be reported (except where such reporting is prohibited by limitations under local law).
    • If a request is from the government, the specific agency from which the request originated should be reported, e.g. judiciary, military, law enforcement.
    • With regard to converging requester profiles, a distinction should be made depending on the origin of the request. For example, if a TDR was originally triggered by an individual’s demand, it should primarily be classified as such, irrespective of the entity submitting the request to the OSP. Where there is such a convergence, this should be reflected in the data.
  3. Reporting on the object of the TDR

The data should indicate the kind of item that is subject to the TDR, e.g. content, account, hashtag, group, page. In addition to such broader item categories, where applicable, granular data on each category should be reported, e.g. for content, the data should indicate whether the targeted content is an image, text, tag, etc.

  4. Reporting on specialized policies

OSPs should report on whether any particular policies apply to specific requesters, e.g. trusted flaggers, reporting agencies, and the relevant normative bases, i.e. internal policy or legal requirement. ●
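By way of illustration only, the sketch below (in Python) shows one possible way the Phase 1 reporting fields described above (request category, normative basis, requester profile, object of the TDR) could be structured as an aggregate, machine-readable record. Every field name and value here is a hypothetical example, not a format prescribed by this SOBP.

    # Purely illustrative sketch: one hypothetical machine-readable record for
    # aggregate Phase 1 reporting. Field names are invented for this example.
    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class TDRReportRecord:
        reporting_period: str   # e.g. "2022-H1"
        jurisdiction: str       # country from which the TDRs originated
        category: str           # e.g. "copyright", "RTBF", "defamation"
        normative_basis: str    # e.g. "DMCA", "GDPR", "local law"
        requester_type: str     # "individual", "government", "trusted_flagger", ...
        object_type: str        # "content", "account", "hashtag", "group", "page"
        tdr_count: int          # aggregate number of TDRs in this bucket

    record = TDRReportRecord(
        reporting_period="2022-H1",
        jurisdiction="DE",
        category="defamation",
        normative_basis="local law",
        requester_type="individual",
        object_type="content",
        tdr_count=42,
    )

    # Publishing only aggregate buckets keeps the data machine-readable while
    # avoiding identification of the individuals who sent the TDRs.
    print(json.dumps(asdict(record), indent=2))

Reporting at this level of aggregation is one way to reconcile request visibility with the rule above that individual requesters should not be identified.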

PHASE 2. Transparency on OSP takedown procedure: Decision-making visibility

  1. Reporting on initiation of takedown for Large Scale OSPs 🔺

OSPs should specify whether a takedown was made in response to a particular request (reactive) or was carried out proactively by the OSP to fulfill a legal requirement.

  2. Reporting on actions that may be taken

OSPs should publish a list of all actions that may be taken pursuant to TDRs.

  • Good practice: Distinctions made by Google (“remove v. block”), TikTok (“remove v. restrict”), and Twitter (“removal, restriction, disable access, cease communicating, cease making available”).

OSPs should publish detailed explanations of each category of action, along with the criteria and representative examples of cases where action may and may not be taken, in order to illustrate the scope, applicability, and outcome of each action. The explanations should be sufficiently detailed to ensure foreseeability of actions and avoid vague, general terms such as “action shall be taken.”

  • Good practice: Google’s “highlights” in the RTBF transparency report[3]

  3. Granular reporting on actions taken

OSPs should report aggregate data on actions taken pursuant to TDRs, both in percentage and total numbers, broken into request categories. This report should also include cases where a TDR is rejected.

  4. Providing information on the make-up of the team of decision-makers

OSPs should explain processes in place, if any, that allow them to send takedown notices to a specific decision-making committee, or clarify if the assignment of cases to certain decision-making committees is a randomized process. They should also clarify:

  • what decision-making directives are in place to help committee members make accurate and consistent decisions, and
  • whether such decisions are consensus-based or vote-based (or whether one member of the team is given a veto).
  5. Reporting on standard checks and operational guidelines in place to review the TDRs
  • OSPs should report on the standard operating procedures in place to determine if the TDR submitted complies with all the procedural or formal requirements, e.g. checking if government notices are duly authorized.
  • OSPs should report the checks (if any) in place to evaluate the substantive aspects of the notice, e.g. analyzing the government takedown notices that are in conflict with local law/user guidelines of the platform/international human rights, or checking if DMCA notices target content permitted under fair use.
  6. Reporting on means of reviewing TDRs

OSPs should report on the means that may be used to review TDRs, i.e. manual, automated, or both. This should include:

  • Scope of each method and accompanying safeguards, if any.
  • Information on the type of content that is sent automatically to automated takedown mechanisms versus content that is sent to human reviewers, and how it is decided whether to send a TDR to a human reviewer or to an automated tool. 🔺
  • OSPs should report the ratio of each method used per request category and the outcome. ●
  7. Reporting obligations regarding human reviewers for Large Scale OSPs 🔺

Providing demographic data on the members of the decision-making board would allow users and researchers to appreciate the data on how TDR decisions are made, and give context for why certain content was or was not removed. Accordingly, OSPs should release aggregate data on:

  • The number of staff employed in a jurisdiction for content review
  • Qualifications of staff (spoken languages, etc.)
  • Location of staff
  • Other particulars, such as age, nationality, race, and gender of staff
  • Whether human review is outsourced to third-party firms and, if so, what working conditions and training exist to facilitate the moderation of content.

OSPs should also release data on: 🔺

  • The selection process pertaining to the make-up of the decision-making board, and whether the members of the decision board are the same as those of the appeal board.
  • What processes are in place to help moderators make consistent and “accurate” decisions, especially in borderline/ambiguous cases.
  • The extent to which human oversight is documented, what mechanisms exist to provide better training to human moderators, and the type of internal support in place to assist them in making consistent decisions.

  8. Reporting obligations for automated review tools for Large Scale OSPs 🔺

  • OSPs should publish a list of all types of automated tools (e.g. filters, digital hashes) that may be used to detect, review, or take down content.
  • OSPs should publish sufficiently detailed information/explanations on automated tools. This should include, at the minimum:
    • How the tools work
    • Whom the tools are available to
    • The development process of the tool, e.g. whether there was input from civil society
    • How regionally, demographically, and linguistically diverse the data are, and what kind of results the automated models generate in response to such diverse notices
      • Good practice: Explanation of Microsoft’s PhotoDNA

  9. Length of decision-making process for Large Scale OSPs 🔺

OSPs should report on the average time spent on TDRs upon receipt of the request by the OSP (including internal timelines), broken into request categories.

  10. Reporting on prioritization of TDRs for Large Scale OSPs 🔺

OSPs should report if and how TDRs are prioritized:

  • Between jurisdictions: How are TDRs from jurisdictions with no on-ground platform presence prioritized and handled?
  • Between categories of notices: Are certain categories of TDRs prioritized over others, e.g. government notices or court orders?
  • Between categories of flaggers: Are notices by trusted flaggers prioritized over those of other users?

  11. Reporting on geographical scope of takedown

OSPs should report on whether takedowns apply universally or only in the jurisdiction in which they are requested (or to any other extent). Information on whether this policy varies across forms of content and across jurisdictions should also be provided.

  12. Additional disclosure mandates for large-scale OSPs 🔺
  • OSPs should disclose error rates for both automated and human reviewers.
  • OSPs should provide information on the accuracy of takedown processes, which can include granular information on the following (an illustrative calculation follows this list):
    • Data on false positives: content that was labeled as violating/infringing but was not in fact a violation/infringement.
    • Data on true positives: content that was labeled as violating/infringing and was in fact violating/infringing.
    • Data on false negatives: content that was labeled as not violating/infringing but was in fact a violation/infringement.
    • Data on the number of erroneous decisions that were left un-appealed and how frequently OSPs proactively correct such erroneous decisions.
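As a purely illustrative aid (not part of the best practices themselves), the sketch below shows how the accuracy figures listed above could be derived from raw counts that an OSP already holds; the counts used are invented placeholders.

    # Illustrative only: deriving accuracy figures from hypothetical raw counts.
    true_positives = 900    # content labeled violating that was in fact violating
    false_positives = 100   # content labeled violating that was not violating
    false_negatives = 50    # violating content that was not labeled as such

    precision = true_positives / (true_positives + false_positives)
    error_share = false_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)

    print(f"Takedown decisions that were correct (precision): {precision:.1%}")
    print(f"Takedown decisions that were erroneous: {error_share:.1%}")
    print(f"Violating content actually caught (recall): {recall:.1%}")

Reporting the underlying counts alongside such derived rates, broken down by automated versus human review, would let researchers recompute and verify the figures.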

PHASE 3. Transparency post-takedown decisions: Decision visibility

  1. Informing users on takedowns

To enhance user-control, OSPs should provide users with an opportunity to address claims identified in the takedown notice so as to minimize OSP interference in disabling the content. This may be achieved through in-platform notifications, so that users can be provided with a self-check mechanism to either take down the content themselves or provide explanation to OSPs as to why their content is non-violating/non-infringing.

OSPs should also inform users on whether they may be subject to further action in relation to the takedown, e.g. termination of account.[4]

  2. Reporting on redress mechanisms for users

OSPs should report on:

  • whether users can respond to TDRs, e.g. if any explanation can be provided to defend the content
  • how to appeal takedowns
  • how such an appeal would be adjudicated
  • by whom it would be adjudicated
  • how long the user can reasonably expect to wait for the outcome of the appeal
  • whether the decision on the appeal is further appealable
    • Large Scale OSPs should publish data on the appeals and outcome, broken into request categories: 🔺
  • number of appeals/counter notices received from different stakeholders (with separate reporting for each stakeholder group)
  • speed at which appeals/counter notices are reviewed
  • ratio of successful appeals per type of request
  • both the number and percentage of unsuccessful appeals
  • place of origin of appeals
  • number/percentage of such appeals originating from takedowns by automated tools versus human reviewers
  • number of erroneous decisions that were not appealed
  • data on whether material was removed immediately and later restored in response to a counter-notice, or whether the OSP waited for a counter-notice to be filed.
    • OSPs should clearly outline the reasons for sending a particular set of appealed cases for external review, if any (for instance, the Facebook Oversight Board), accompanied by the procedure adopted by such an external board while reviewing takedown decisions. 🔺

  3. Reporting on appeals by the OSP

When appealing TDRs issued by courts or government agencies, OSPs should report, at a minimum, on the following:

  • Percentage and total number of appeals made per TDR category
  • Reasons for appealing the TDRs
  • The appeals process and outcome
    • Good practice: Google’s practice of updating the original court order notice to include the result of the appeal
  • Whether there are processes in place to send appealed notices to a specific decision-making committee/board, and whether this process is random or deliberate
  • If a board is making the decision, what decision-making directives are in place: is the decision consensus-based or vote-based, or is one member of the board given a veto?

  4. Sharing takedown data with researchers and repositories 🔺

  • With consideration to data protection and other legal requirements, OSPs should maintain an archive of removed content of public interest as well as of rejected requests (including the content), and provide researcher access thereto.
  • Researchers should be given access to data on the type of content that reviewers (both automated and human) tend to flag as “ambiguous” and the length of time taken to process such content.

III. Operational Best Practices

  1. Periodic reporting

OSPs should publish bi-annual transparency reports. However, there are certain instances captured below in which more frequent reporting would be necessary:

  • Large-scale OSPs should have in place a dedicated website where transparency reporting is updated more frequently, such as on a monthly or weekly basis, and no less frequently than quarterly. 🔺
  • Researcher Data should be made available on a monthly basis.
  • In the event of extraordinary measures or crises, reporting of essential information should be made available to researchers and third party repositories such as Lumen.
  2. Granular reporting
  • OSPs should publish aggregate reports as well as reports specific to the request category.
  • Data should be broken down by different products offered by a company, where appropriate, e.g. Google search v. YouTube.
  • Reports should provide analysis of request trends, including comparison with previous reporting periods and overall. 🔺
    • Good practice: Twitter’s transparency reports
  • Statistical data provided in transparency reports should be presented in aggregate format.
  3. Contextual reporting
  • Under each request category, OSPs should provide examples from actual cases (e.g. Google’s highlights) in a principled and representative manner, including information on the post-takedown/action phase.
  • Explanations of a policy, law, decision etc. should be sufficient, intelligible, and succinct.
  • OSPs should provide for an FAQ section to address gray areas.
  4. Readability
  • Language should be uniform; different terms should not be used interchangeably.
  • Reports should be available in as many languages as possible and at the minimum, in the language of the content subject to TDR. 🔺
  • The reports should be published in a user-friendly and intelligible transparency interface.
  • The reports ought to use flowcharts explaining the takedown process.
  • Using visuals, interactive graphics, and charts to demonstrate statistics will make the report more reader-friendly.
  5. Ease of reference
  • All data and visuals should be downloadable and machine-readable, e.g. Excel, JPEG, where applicable.
  6. Researcher Access 🔺

While general users would benefit from increased transparency reporting, additional granular and comprehensive data should be made available to selected researchers. Such qualified independent researchers, such as academics, archivists, librarians, and students, are to be given credentials to access data that is not accessible to the general public.

  • A system similar to Lumen’s tiered policy could be implemented:
    • researchers submit applications to the OSPs containing their credentials and an indication of their interest and work in the area.
    • a team evaluates their applications and grants access for a limited time period (2-3 months), which can be extended on request.
    • certain sensitive forms of TDRs (such as CSAM/NCII) to be allowed to be accessed by only specialized and vetted researchers.
  • Alternatively, OSPs could give third parties such as Lumen the responsibility to onboard researchers.
  • Content that is made available to researchers is termed as “Researcher Access.”
  • A form to request certain granular data should be provided for Researcher Access.
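To make the "Ease of reference" and "Researcher Access" practices above concrete, here is a minimal, purely illustrative sketch (in Python) of exporting the same aggregate records at two levels of granularity, one public and one for credentialed researchers. The file names, fields, and figures are invented for the example and are not prescribed by this SOBP.

    # Illustrative only: exporting downloadable, machine-readable CSVs at two
    # levels of granularity. All field names and values are hypothetical.
    import csv

    records = [
        {"period": "2022-Q1", "category": "copyright", "jurisdiction": "US",
         "tdr_count": 1200, "removal_count": 950, "appeal_count": 80},
        {"period": "2022-Q1", "category": "RTBF", "jurisdiction": "FR",
         "tdr_count": 300, "removal_count": 180, "appeal_count": 25},
    ]

    PUBLIC_FIELDS = ["period", "category", "jurisdiction", "tdr_count", "removal_count"]
    RESEARCHER_FIELDS = PUBLIC_FIELDS + ["appeal_count"]  # extra granularity, gated access

    def export(path, fields):
        # Write a machine-readable CSV limited to the given fields.
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
            writer.writeheader()
            writer.writerows(records)

    export("transparency_public.csv", PUBLIC_FIELDS)
    export("transparency_researcher.csv", RESEARCHER_FIELDS)

The design point is simply that the public report and the researcher dataset can be generated from the same underlying records, with access to the more granular export controlled through the credentialing process described above.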


INTERNAL REFLECTIONS ON THE CREATION OF AN SOBP

1. Reflection on the groups’ process

1.1. Reflection on working group 3’s project

For Session 04 of the Sprint, the BKC had invited Brandon Butler to discuss his experience in developing best practices around fair use in copyright. This session also acted as a starting point for our brainstorming process. In particular, we found his discussion on identifying the communities that are interested in these best practices an interesting perspective to adopt while thinking about our SOBP.

During our first session as a working group, therefore, we spent time discussing this community-oriented approach and how it could apply to takedown transparency. At the outset, we identified that unlike fair use in copyright, which has very defined communities (creators who are interested in fair use, for instance), the contours of the communities who would have a stake in takedown transparency would be more complicated. Therefore, as a first step, we tried to comprehensively list all the communities that we thought would be interested in takedown transparency, and how transparency would benefit each of them. The list looked like this:

  • OSPs: Having better transparency standards around content takedown would improve trust among and goodwill with users, and will diminish liability risks from potential lawsuits.
  • Researchers: Transparency standards would ensure that there are checks and balances in the processes around content takedown and that they adhere to existing human rights standards. Researchers would also be interested in robust transparency standards, since that would allow them the academic freedom to conduct research on the OSP’s processes.
  • End-users (non-commercial): Transparency standards that clearly delineate what can/cannot be shared would lead to more legal certainty and a safer internet environment. Such standards might also advance the goal of freedom of expression online as well as better access to information.
  • Content creators (commercial): For content creators, clear transparency standards would serve their business goals, since such processes would presumably afford more due process in moderating content.
  • Government/policy makers: Transparency standards that clearly delineate what can/cannot be removed will afford government parties more legal certainty about enforcement, thereby reducing costs of having potentially infringing activity unnoticed.
  • Third parties/trusted flaggers: This was one community with which we initially struggled. We had a previous session with Professor Daniel Seng, where we discussed the phenomenon of trusted flaggers misusing their position to send overbroad and arbitrary DMCA takedown notices to OSPs. In that light, we found it difficult to reconcile what sort of benefits trusted flaggers would have in case of increased transparency standards. Tentatively, we believed that similar to content creators, a level of due process instituted in content takedown procedures might also ultimately be beneficial to trusted flaggers as well.

During this deliberation, we also had the opportunity to consider whether these communities have interests that go against transparency. This is what we established:

  • OSPs: Stringent transparency norms might interfere with an OSP’s freedom to conduct business. OSPs can also contest mandated disclosures around algorithmic content moderation, since that might go against the proprietary nature of these technologies.
  • Researchers: Depending on how best practices on transparency are established, researchers might find it difficult to access data on the OSP’s policies. For instance, if the best practices ask OSPs to share data with a narrow, tailored class of people, that could work to the exclusion of the rest of researchers.
  • End-users (non-commercial): In select cases, more transparency might directly go against user-interests. For instance, on cases involving non-consensual dissemination of intimate/sexual imagery, the aggrieved user would have a strong interest in ensuring that data around moderation of such content is not made available to the general public.
  • Content creators (commercial): Unequal access to knowledge about how OSPs moderate content might be beneficial to content creators who benefit from such inequality. As such, more transparency might go against their interests, since it would introduce more competition.
  • Government/policy makers: For ‘sensitive’ content, governments might cite national security and public interest concerns, to oppose transparency. Similarly, governments might also have stakes in obscuring information around illegitimate oppression of speech.
  • Third parties/trusted flaggers: Malicious trusted flaggers would not be interested in stricter procedures, because they will be less successful in taking down non-infringing content.

Content moderation is often addressed as a static point in time: the takedown. However, from the initial brainstorming assignment for session 0 (answering the questions ‘what is transparency to you?’ and ‘how can transparency be applied to takedown requests?’), we realized that transparency and takedown requests were interpreted differently by different members. For instance, working group members pointed to the need for clear guidelines and procedures and more information on takedown decisions. Members also focused on due process when making decisions (e.g., the composition of institutional mechanisms), and the importance of post-moderation transparency to scrutinize the request that had led to the decision. Ultimately, upon introspection, we found that within our working group, these SOBPs spoke to transparency at three very different stages in the content moderation process: pre-decisional, the takedown, and post-decisional.

We decided to merge our community-oriented approach with this three-stage framework. Accordingly, we decided to flesh out each community’s interests for and against transparency in each of the three stages of the content moderation process. We hoped that this mapping, across all communities, would allow us to discover some overarching transparency norms at each stage. This could then form the basis for our SOBP.

We arrived at this table:

Working Group 3’s three-stage content moderation

Stage 1: Pre-decision transparency

This stage covers the period prior to the takedown decision and focuses on the prerequisites for the justifiability of a takedown decision.

Issues in this stage for the different communities

(conflicting interests / fundamental rights)

OSPs:

Against transparency: Freedom of contract/freedom to conduct a business/entrepreneurial freedom to decide what should and should not be on the platform.

In favor: Unclear guidelines on what should be removed can lead to potential liability, bad press/reputation for overblocking or underblocking. Better transparency standards may also incentivize them to push back against repressive censorship laws, by publishing more information about the arbitrary nature of the government taking down content.

Conflict between these interests: OSPs would presumably like more transparency on what to remove, but would want to keep some kind of freedom to run their own service in the way they please.

End-users:

Against transparency: N/A

In favor: Clearer guidelines on what can be removed/what the boundaries of the normative power of OSPs is, can create more legal certainty and more trust in the following decision on takedowns. Also empowers end-users to challenge removals outside of the scope.

Commercial users:

Against: N/A

In favor: (same as end-users)

Researchers:

Against: N/A

In favor: N/A

Trusted flaggers:

Against: Clearer guidelines take away the opportunity to send ‘opportunistic TDRs’. Sending such TDRs is malpractice and should not weigh too heavily for us, but it might be an underlying interest for this party to be against transparency guidelines.

In favor: More clarity on ‘vague content removals’ such as defamatory content. Most trusted flaggers work for commercial actors (IPR-based infringements), and those categories are already outlined well on most platforms due to strict legislation and liability risks.

Policy/law makers:

Against: Enforcement burden; who is going to handle complaints about non-transparency? Compelling OSPs to be less transparent may also help regulators obscure details around arbitrary content takedowns.

In favor: The transparency requirements on what is removed by OSPs can serve as a basis to analyze the need for future legislation. For example, when OSPs are open about removing ‘sexually explicit content’ (as with OnlyFans), law makers can pick up the discussion on whether such content is illegal or not. It could also spark a discussion on whether (large-scale) OSPs should only be allowed to remove illegal content and whether a ‘must-carry obligation’ is necessary for non-illegal content.

Potential overarching aim: transparency in this first stage creates legal certainty for all parties.

Transparency goals in this stage:

Create transparency in what can and cannot be removed from OSPs’ services, and limit the endlessly broad normative power that OSPs can exert over takedowns on their services.

Can OSPs decide to remove non-illegal content from their platforms based on their terms of service? What requirements should those terms of service have in that case?

This stage does not concern problems with (the enforcement of) ‘illegal content’, as those are governed by law (illegality), but rather content that is removed without a legal basis, such as misinformation or content the platform itself has decided is ‘harmful’ (such as sexually explicit content on OnlyFans).

Stage 2: The takedown decision

This stage covers the takedown decision and focuses on the procedural and organizational requirements for taking a decision to remove content.

Issues in this stage for the different communities

(conflicting interests / fundamental rights)

OSPs:

Against transparency: An OSP’s independence in running its business according to profit imperatives will be impacted. More scrutiny can also lead to more criticism and more liability, as every decision made by the platform can be contested. Finally, more transparency can lead to slower/weaker internal processes due to the high risks attached to decision making. This can increase transparency costs and be difficult for smaller/newer OSPs to implement.

In favor: Clearer processes help build trust and legitimacy in the other communities. Clearly defined and publicly stated processes for takedown notices can potentially save platforms from taking difficult decisions with long-term public consequences in silos and based on shifting goalposts (like determining what constitutes hate speech/disinformation).

It can also help in initiating a more nuanced conversation and understanding of issues of online speech and building collaborative solutions.

Conflict between these interests: OSPs might want to share some data that give a higher level overview (percentage of decisions taken) but any more procedural transparency can be seen as a threat to the freedom of running their private business.

End-users:

Against transparency: Editorial transparency can have the negative implication of platforms taking decisions to please the regulators or the dominant political power in a particular jurisdiction. Potentially, too-stringent transparency standards can hamper platforms’ ability to take decisions in emergency circumstances. Finally, there can be privacy concerns for users in practices around data-sharing, as even anonymized data can be traced back.

In favor: Transparency in how notices are processed and decisions are taken can provide accountability to users. It will also act as a check against the disproportionate incentives to take down content, and against discrimination where some users/communities become easy targets of takedowns.

Transparency standards can also improve the review process behind each takedown notice by increasing the number of human reviewers, building better automated tools and increasing the presence of legal experts/ moderators/ reviewers from different jurisdictions.

Finally, transparency will act as a check against:

  • Platforms prioritizing certain jurisdictions while maintaining little to no mechanisms/ human reviewers/ automated tools in certain languages and dialects
  • Platforms following different mechanisms or taking arbitrary decisions in response to public backlash or political circumstances.
  • State interference/ backdoor channels of communication and censorship

Conflict between these interests: Increasing freedom of expression and the right to know versus the right to privacy

Commercial users:

Against Transparency: Editorial Transparency can have the negative implication of platforms formulating procedures/ policies for takedown that favor bigger players due to business interests.

In favor: Transparency in how notices are processed and decisions are taken can:

  • Act as a check against the OSP’s incentives to over-classify content as copyright violations
  • Provide better information to raise counter-notices based on fair use
  • May also be a step in addressing the power asymmetry between small content creators and powerful players using automated anti-piracy tools.

Researchers:

Against transparency: There might be ethical considerations behind standards that mandate data-sharing, especially for sensitive content that can breach user privacy.

In favor: Provide insights into how/ why platforms make decisions on takedowns. This can lead to better understanding in many areas including:

  • Which users are more likely to be targeted in which categories of TDRs?
  • What kind of content is more likely to be taken down?
  • Whether the content taken down is actually illegal?
  • Are laws empowering governments to censor content being misused?

Trusted flaggers:

Against transparency: Presence of procedural transparency reduces the otherwise disproportionate incentives for platforms to take down content.

In favor: More clarity on how content takedown decisions are taken would result in more certainty and predictability with respect to flagged content.

Policy makers:

Against transparency: States might be interested in maintaining the confidentiality of certain state notices, since better transparency might reduce the space for backdoor channels to platforms. It can also potentially expose the state as a party to litigation against platforms’ decisions.

In favor: Transparency might provide checks against arbitrary decision making by platforms. As a result, states can better assess platform’s compliance with local laws.

Potential overarching aim: Transparency in the second stage can provide an insight into how/ why takedown decisions are taken and also have an effect of improving these internal processes to be more equitable and fair.

Transparency goals in this stage:

The incorporation of due process in takedown decision-making, which would include:

  • Clearly stated policies on how takedown notices in each category and each jurisdiction are processed.
  • How are takedown notices prioritized
  • Between Jurisdictions (if there is no on-ground presence for processing notices)
  • Between categories of notices (for instance are court/government orders prioritized?)
  • What part of the decision-making process is automated, and for what category of notices. Similar granularity for notices that require human intervention/review mechanisms. Under this further information on the following can be provided:
  • The error rates of such automated tools.
  • The availability of tools across jurisdictions and languages
  • If a notice is found to be procedurally incorrect what happens next?
  • What checks (if any) does the platform apply to check if the substantive part of the notice is correct?
  • Whether the government’s takedown notice is in violation of local law/ user guidelines of the platform/ international human rights?
  • What circumstances prompt platforms to make such reviews and evaluations?
  • What processes/ communication channels/ appeal mechanisms do platforms follow in such instances
  • The reasons for the action taken to be recorded clearly and unambiguously (for instance, stating whether the content was in violation of: Local laws/Platform Community Guidelines/Terms of Service/Court Order)
  • Archiving and sharing content that has been taken down (if possible) or metadata of content (including engagement, demographic details of users) with researchers
  • Transparency on composition of internal institutions dealing with takedown notices.
  • Providing users whose content is taken down with a copy of the notice.

Stage 3: Post-decision transparency

This stage covers the period after the takedown decision and focuses on the transparency requirements that allow checks-and-balances.

Issues in this stage for the different communities

(conflicting interests / fundamental rights)

OSPs:

Against transparency: Transparency might prevent OSPs from taking business-oriented decisions that would encroach upon stated terms and conditions. It also creates more opportunities for criticism of the OSP’s decision-making process and end-decisions, hampers speedy decision-making in cases of emergency, and potentially scares off users by underlining the over-removal of content before appeal.

In favor: Post-decision transparency guarantees end-users that the rules established by the platform are respected, thereby building trust with user communities. This can empower users to avoid self-censorship and allow for the creation of more user-generated content. Finally, established appeals processes might reduce risks of liability and judicial redress by providing users with information on appeals.

End-users:

Against transparency: If not well-designed, appeals processes might threaten user privacy.

In favor: Better appeals processes provide users with data to assess OSPs’ respect for due process and the right to appeal; they allow users to avoid self-censorship due to fear of a lack of due process; they ensure legal certainty as to which types of content can be freely published on the OSP; and they act as a pedagogical tool and a hands-on guide on what to expect when appealing an OSP takedown decision.

Conflict between these interests: Transparency on the right to appeal is to be balanced with the right to privacy. In particular, it is important to consider how to use the concept of “meaningfully public data” to determine which data publications could be subject to less privacy scrutiny and more explainability on takedown decisions.

Commercial users:

Against transparency: Commercial users might have vested privacy interests in appeals processes, given the “name and shame” effects of transparency. This is because commercial users, who on average have a larger audience, might be categorized as “meaningfully public,” which means more information on content takedown processes surrounding them might be published.

In favor: Good appeals processes would reassure commercial users that user-generated content cannot be deleted from the platform without due process. This, in turn, also enhances freedom of expression by outlining the procedure used to prevent access to online infrastructure.

Researchers:

Against transparency: N/A

In favor: Better disclosures would allow researchers to study the breadth and usefulness of appeal procedures, evaluate the number of internal appeal procedures being litigated in courts, and assess OSPs’ due process.

Trusted flaggers:

Against transparency: Appeals might disavow TDRs made by trusted flaggers.

In favor: Reports on efficiency of TDRs by trusted flaggers.

Policy makers:

Against transparency: Appeals might undo TDRs made by policymakers and governments; These processes might also underline reckless and voluminous requests made by governments and reduce influence on platform decision-making.

In favor: Transparency around appeals processes might inform the public on the number of internal appeals that end up being litigated in courts.

Potential overarching aim:

Transparency in the third stage ensures that due process is respected. Including litigation in courts following internal appeals acts as an additional check on OSPs.

Transparency goals in this stage:

Create transparency on how platforms treat appeals and internal breaches of due process.

It is important to include external appeals in the transparency SOBP in order to emphasize the role of the judiciary in dealing with content moderation.

The concept of a “public user,” whose audience and reach lessen privacy requirements in favor of more explainability and transparency, would be useful to inform the broader community as to what decisions the OSP is making.

1.2. Reflection on working group 4’s project

I. Brainstorming

Our group started with an exchange of our individual purposes for the SOBP, i.e. what are we trying to achieve with this SOBP? Some ideas expressed by the group members were: to empower users in general, to respond to a societal interest in understanding why, how, and to what extent content is taken down, to ensure accountability of OSPs, to promote user-generated content and the principle of fair use, and to push for a free internet. When discussing the purpose of the SOBP, members’ answers for the individual brainstorming assignment proved to be quite useful.

We approached the conversation surrounding takedown transparency with both optimism and skepticism. Is sunlight the most powerful of all disinfectants? If it is, how best can light be allowed to shine on something that is darker than others? Who is the sunlight for? These were a few preliminary questions with which we kicked off our brainstorming sessions. We realized that to arrive at “meaningful” transparency, we needed to take into account the risks of both underexposure and overexposure, while recognizing that where transparency is currently low, erring on the side of more sunlight is beneficial.

This exchange on the purposes of the SOBP also entailed a discussion of what we were aiming for with the data that OSPs are expected to disclose (what we wish to understand from the data) and who the main beneficiaries of the SOBP would be. The group believed that the answers to these questions would help to determine both the scope of the SOBP, e.g. tailoring transparency obligations accordingly, and the specifics of the best practices.

As regards the goals of the SOBP, the group expressed various ideas, including the following:

  • To understand the current ecosystem of TDRs: what are the problems generated by TDRs? How are takedown mechanisms being misused, why, and by whom?
  • To check whether a fair request for takedown is met by a fair response.
  • To answer questions of request transparency, e.g. how many TDRs are sent and by whom.
  • To answer questions of procedural/decisional transparency (how TDRs are reviewed, e.g. means of review, error rates) as well as post-decision transparency.

With respect to potential beneficiaries of the transparency afforded by the SOBP, the group briefly discussed the potential “members of the best practices community,” whether the SOBP should focus on particular stakeholders or cater to all, and the pros and cons of both approaches.

These discussions led us to determine our approach for the SOBP. The group discussed different extents of transparency, i.e. maximum, minimum, or “sliding scale” transparency, as well as the pros and cons of each approach. The group seemed to agree upon the contextual, “sliding scale” approach, which would prescribe transparency depending on various criteria, such as the size of the OSP, the beneficiary of the specific transparency obligation (stakeholder), and whether the OSP decides on the merits of a TDR, as in the case of RTBF. Nevertheless, the group agreed that, as a rule, government requests require more transparency, whereas less transparency would be required around sensitive material or vulnerable individuals (who are expected to benefit from the takedown). The group also emphasized the necessity of considering jurisdiction-specific issues.
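To make this contextual approach more concrete, the following minimal sketch, written only for this reflection, expresses the sliding-scale idea as a simple decision rule. The tier names, criteria, and their ordering are hypothetical assumptions for illustration, not something the group prescribed or agreed upon.

```python
# Hypothetical illustration of a contextual, "sliding scale" transparency rule.
# Tier names, criteria, and their ordering are assumptions for illustration only;
# they are not prescribed by the SOBP.

def transparency_tier(government_request: bool,
                      sensitive_or_vulnerable: bool,
                      merits_decided_by_osp: bool,
                      large_scale_osp: bool) -> str:
    """Return an indicative transparency tier for a given takedown context."""
    if sensitive_or_vulnerable:
        # e.g. CSAM, NCII, or vulnerable individuals expected to benefit
        # from the takedown: disclose less detail publicly.
        return "reduced"
    if government_request:
        # As a rule, government requests call for more transparency.
        return "maximum"
    if merits_decided_by_osp or large_scale_osp:
        # e.g. RTBF requests decided on the merits, or Large Scale OSPs.
        return "enhanced"
    return "baseline"


# Example: an RTBF request decided on the merits by a large OSP.
print(transparency_tier(False, False, True, True))  # -> "enhanced"
```

In practice, such criteria would need to be weighed against each other rather than applied as a strict hierarchy; the sketch merely fixes one plausible ordering.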

The group also took note of the limitations of the project: limitations of transparency in general, limitations of existing transparency reporting requirements, and potential limitations of the group’s own SOBP. With regard to existing transparency reporting requirements, the group spotted two main issues: a lack of the granularity required for independent analysis and a lack of context in the published reports. Significantly, the group broadly discussed the potential lack of incentives for OSPs to adhere to transparency requirements. The group found that, from a legislative standpoint, disclosure mandates were scant and that efforts to investigate the pitfalls of laws like the DMCA and CDA were only being carried out by researchers. Due to the broad latitude extended by such laws, OSPs and social media platforms did not have adequate incentive to apply due process norms when taking down content. Furthermore, unless backed by sanctions, disclosure mandates relied only on the fear of damage to reputation and goodwill should platforms be exposed as taking down content in a discriminatory, arbitrary, or disproportionate manner. This “economic asymmetry” problem, i.e. the lack of financial incentive for platforms to take users’ rights seriously, was therefore a key point we wanted to address in the SOBP: we aimed to make the best practices as non-onerous and general as possible while ensuring that they did not compromise the core goal of transparency.

During the initial brainstorming, some members conducted a literature review, as recommended by the Sprint Team. This exercise proved to be beneficial for understanding the advantages and shortcomings of existing best practices, identifying gaps, and forming the structure of the SOBP in terms of granularity, methodology, language, etc.

II. Forming the SOBP

Following the brainstorming phase, the group came to the understanding that awareness of sufficiently detailed information on existing transparency practices and requirements is a prerequisite for building on them and working out their deficiencies. Accordingly, the group examined a number of existing transparency reports and compared them against each other in order to identify areas where there was scope for improvement. In our attempt to answer the questions mentioned above, the overarching aim was to build on leading best practices such as the Santa Clara Principles on Transparency and Accountability in Content Moderation and the New America Transparency Toolkit. Therefore, while our foundational principles overlap significantly with those outlined in the Santa Clara Principles, we identified ways in which OSPs could be more meaningfully transparent about content takedown, specifically to users and researchers.

Against this backdrop, we moved on to discussing the specifics of the SOBP. In order to delineate its scope, some of the questions we pondered were:

  • Should we focus on quantitative or qualitative transparency, or both?
  • What should be the applicability of the best practices in terms of OSP types, jurisdiction, and so on? In this regard, what are the criteria for classifying OSPs?
  • How granular should the SOBP be?
    • What are the different transparency contexts? e.g., sending notices to users and databases like Lumen, transparency reports, researcher access, internal bodies.
    • Should there be best practices specific to certain categories of takedown requests and, if so, what are these categories?

Before answering how best to hold OSPs accountable, we mulled over whether platforms are the right actors to govern the internet. Challenging the adjudicatory role of OSPs was an ambitious endeavor and one that did not squarely fit into the scope of this research exercise. However, the question became a springboard for us to assess other questions, such as the limits of their role as an adjudicatory body. The potentially discriminatory or arbitrary application of guidelines and rules, and the increasing use of automated tools to monitor online content, were approached from all possible angles. We found building user trust to be a cornerstone of our SOBP. We emphasized that OSPs should commit to respecting human rights and embed that commitment across all phases of takedown. Such a commitment must be assessed and periodically reported, and any harm arising from overzealous takedowns must be addressed by way of a redressal mechanism. The sufficiency of grievance redressal must be tracked and reported and, more importantly, made available to relevant stakeholders in an accessible manner. Accountability to this commitment requires OSPs to abide by such foundational principles both procedurally and substantively; to that end, the report has tried to be as granular as possible and erred on the side of maximum transparency, while taking into account certain risks of over-transparency. This paved the way for us to create a tripartite model: general, operational, and phase-wise best practices.

III. The Tripartite Model: General, Phase-wise, and Operational Best Practices

The Tripartite model, as we envisioned it, was surprisingly similar to how Working Group 3 had approached their work. Our aim was to segregate the TDR process into buckets that could then be examined separately to assess transparency best practices. We drafted the general and operational best practices to be applicable across all phases of the takedown process, with the aim of ensuring comprehensive, granular, and reader-friendly reporting. The phase-wise best practices are aimed at tackling each stage of the takedown process separately. Our phase-wise approach focused on (1) Transparency on the TDRs themselves; (2) Transparency on the decision-making procedure on how the TDR is handled; and (3) Post-hoc transparency, or transparency on practices after the decision on the TDR has (and hasn’t) been made by the OSP.

Being cognizant of the fact that a one-size-fits-all approach may not be useful for our purposes when working with OSPs that offer different products and differ in size, network effects, and so on, we strove to design the SOBP to address these differences. Accordingly, we have marked out transparency requirements that may be followed by OSPs of larger sizes. We also acknowledged that the general body of users does not require the kind of comprehensive data that researchers would appreciate. For this reason, we have separately marked out the information that could be made available only to certain selected researchers.

The tripartite model found synergy with Working Group 3’s three-tier model, and merging the two was a seamless exercise, as both teams approached transparency through a stakeholder-specific lens and worked with similar foundational principles. Their three-tier approach focused on the pre-removal, during-removal, and post-removal phases, and carved out transparency goals for each tier. Our working group adopted a similar structure, as the three-tier model was subsumed by the phase-wise best practices. The difference was that, while Working Group 3 deliberated on specific goals for each stage, our working group focused on two overarching objectives, the process of decision-making and the decision itself (i.e. process transparency and decision transparency), and aimed to get OSPs to reveal as much information as possible about the procedure, the means used to apply it, and the final decision arrived at after applying it. Therefore, both working groups’ ethos was similar and the pillars used to construct the SOBP were also similar, while, metaphorically speaking, the bricks that made up each pillar differed without contradicting each other. Accordingly, the merging was a coordinated exercise of discussing which bricks to place under which pillar and why. Additionally, our working group aimed to incentivize OSPs, especially large-scale OSPs, to provide granular information on takedowns while ensuring that small-scale OSPs were not negatively impacted by the onerous nature of the obligations.

With regard to methodology, our working group decided to refer to existing good practices of OSPs where possible.

IV. Lessons Learned

Our research and analysis of existing transparency practices and requirements led us to the understanding that any attempt to draft an SOBP is incomplete without broad, meaningful stakeholder engagement. We found this to be all the more true as our best practices were informed by a human rights-based and pluralistic perspective. Although we strove to be intentional with each best practice, we also concluded that close engagement with stakeholders is indispensable to doing so. This requires being thoughtful about who the constituents of the “best practices community” are and may be in the future. Significantly, any stakeholder engagement must include OSPs from early on and, ideally, enable active communication with other stakeholders. For instance, in order to assess the efficacy of and overcome challenges regarding existing best practices, as well as to identify and bridge gaps, we felt the need to know whether and to what extent OSPs make use of these best practices and to get granular feedback. The question of incentives for OSPs to adopt the SOBP, which we grappled with throughout our work, could also be addressed through such engagement.

Another key lesson has been that we need diversity in all aspects; diversity in stakeholders who have varying interests and expectations is a must. A major takeaway in this regard has been that risks arising from over-transparency must be thoroughly investigated. We believe that one way of doing this is through stakeholder input. Diversity in OSP types is also critical, which we addressed by marking out transparency requirements for OSPs of larger sizes. However, we believe that criteria other than size should also be considered in future efforts. Finally, introspectively, we believe that the diversity of the team in charge of preparing the SOBP is also crucial. Specifically, in our experience, we found that technical expertise, e.g. in the context of automated review, along with experience from the “field,” e.g. civil society/activist experience, is essential.

V. Unresolved Questions and Looking Ahead

Given the time and resource constraints, there are some issues that the group was not able to address, or could address only partially.

One pressing issue is the incentives for OSPs to adopt the SOBP, which is crucial to ensure its widespread adoption. The group observed that OSPs have various incentives to comply with TDRs and that, whatever the underlying motivations may be, they also appear inclined to be transparent about takedowns, especially if a takedown originates from external requests. For further work, the group would ask whether this is in fact the case and, if so, what those incentives are. How can we make use of these incentives to shape the best practices so as to attain broad OSP buy-in?

Another crucial matter, which was also prompted by the Research Sprint team, is ensuring the applicability of the SOBP across jurisdictions. As much as the group strove to address this by adopting a general approach not focused on any one jurisdiction, the group would question whether a jurisdiction-neutral transparency model is even possible or feasible, and ask how widespread adoption by OSPs can be attained given potential clashes between voluntary best practices and legal obligations. In this context, the group would also pose broader questions about ensuring the flexibility of the SOBP as well as “future-proofing” it in view of evolving technology and the dynamics between actors.

In addition, the group’s attempt to address specific challenges arising from the use of automated systems felt incomplete due to the black-box nature of AI systems. During our research, we realized that the challenges faced by small-scale OSPs and large-scale OSPs were vastly different, i.e. the types of errors made by human reviewers are not the same as those made by automated tools. However, a lack of access to information on how automated models take down content, or on which models exist to perform this task, limited our research on large-scale OSPs. Going forward, we would explore scholarly work in this area and flesh out specific transparency obligations to make automated takedown mechanisms more transparent.

Finally, the group would like to note some aspects that should be dealt with in further detail:

  • Each best practice should specifically address relevant limitations and trade-offs and provide guidance on how the latter can be resolved.
  • Where applicable, best practices should incorporate considerations for different categories of TDRs (e.g. government requests) and content/issue types (e.g. RTBF, CSAM, government secrecy). In relation to TDR types, the group notes the necessity of further research regarding the impact of network shutdowns on the takedown ecosystem.
  • Where applicable, best practices should be further refined according to how much transparency would be afforded to whom. Accordingly, the SOBP should recognize audiences other than researchers.
  • Future efforts should build on our initial work regarding takedown abuse and safeguards. Specifically, retaining content that has been taken down and ensuring researcher access thereto, as well as transparency on rejected TDRs, should be further investigated. With regard to the former, the group makes note of digital repository efforts.[6]
  • Further research should be conducted to determine criteria for “Large Scale OSPs” and the extent of “Researcher Data.”

2. Post-mortem improvements: what’s next?

Our work has focused on the overarching values necessary to establish a common framework for transparency. It leaves pending a wide array of questions regarding adaptability to quickly evolving platforms and to changing moderation rules across jurisdictions. Specific statements of best practices for platforms ought to address relevant limitations and trade-offs and provide guidance on how they can be resolved. It is essential to underscore that specific SOBPs will differ in how much transparency is afforded to whom; notably, the scope of transparency under each category could be crafted according to the relevant audience. As a result, we ought to get into the specifics of creating a data taxonomy and attributing each type of data to particular actors according to their interests. Moreover, our SOBP did not interrogate automated takedowns (method, programming, results) in as much detail as we would have liked, due to a lack of technological capability; this is something that ought to be tackled in the future. Finally, our work could be confronted with transparency regulations mandated by states and would benefit from participation and feedback from stakeholders.
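As a minimal sketch of what such a data taxonomy might look like — the category names, fields, and audience labels below are hypothetical assumptions rather than anything this SOBP prescribes — each disclosure item would be attributed to the actors whose interests it serves:

```python
# Hypothetical sketch of a data taxonomy attributing disclosure items to audiences.
# Category names, fields, and audience labels are illustrative assumptions only.

DATA_TAXONOMY = {
    "request_data": {
        "fields": ["requester_type", "legal_basis", "date_received"],
        "audiences": {"general_public", "researchers", "affected_user"},
    },
    "decision_data": {
        "fields": ["review_method", "reviewer_type", "outcome", "appeal_available"],
        "audiences": {"researchers", "affected_user"},
    },
    "content_metadata": {
        "fields": ["engagement_level", "date_posted", "date_removed"],
        "audiences": {"researchers"},  # e.g. vetted researcher access only
    },
}


def disclosures_for(audience: str) -> list[str]:
    """Return the taxonomy categories that a given audience would receive."""
    return [name for name, item in DATA_TAXONOMY.items()
            if audience in item["audiences"]]


print(disclosures_for("researchers"))     # -> all three categories
print(disclosures_for("general_public"))  # -> ['request_data']
```

A real taxonomy would, of course, require stakeholder input to decide which fields belong in which category and which audiences may access them.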


Annex: Glossary of terms

General terms

  1. Online service providers: Online service providers (OSPs) include a broad range of entities that provide electronic communication services or remote computing services[7]. Thus, entities providing network access, peer-to-peer messaging, email, online news broadcasting, online video or audio streaming, search engines, e-commerce, online banking, social media, etc. are classified as OSPs.
  2. Commercial users: A commercial user uses the service of an online service provider to generate economic gain. Included in this category are, for example, social media influencers and third-party resellers on online marketplaces. Excluded are individuals who occasionally sell their products by means of online service providers.
  3. Trusted flaggers: “Trusted flagger” is a status accorded to an individual or organization by an OSP in recognition of valuable expertise or experience in content moderation. It can come with access to additional resources and responsibilities in the content moderation ecosystem of that OSP.
  4. Community standards: Community standards refers to the terms of service applied by online service providers. In practice, a variety of terminology is used, including user terms, user agreements, community standards, policy guidelines, and house rules. These community standards include the platform’s policy on content moderation and describe which content is allowed on the platform’s service.

Stage 1

  1. Opportunistic takedown requests: An opportunistic takedown request refers to the practice where trusted flaggers or third-party notifiers send a takedown request to an online service provider without certainty that the content is infringing.
  2. Large Scale OSP: Refers to large-scale platforms with social functions. In German case law (Bundesgerichtshof, III ZR 179/20 and III ZR 192/20), a distinction is made between social media platforms that significantly impact users’ social life and those that do not. As a result of the third-party effect of fundamental rights related to this ‘social function’, the former are subjected to stricter procedural rules on content moderation.

Stage 2. In correspondence with Phase 2: Transparency on OSP takedown procedure (Decision-making visibility)

  1. Procedural transparency: An important component of transparency is open decision-making, such that the processes and tools used to arrive at decisions are subjected to oversight. Procedural transparency in the context of OSPs’ content moderation decisions includes disclosure of information on internal institutions, resource allocations, and the processes followed to arrive at takedown decisions.
  2. Editorial transparency: Editorial transparency means disclosing information pertaining to the OSP’s editorial decisions and operations. It can include information on the platform’s internal editorial policies that determine what content can be taken down, information on the algorithms and efficacy of automated tools that are used in such decisions, justifications and explanations of specific decisions, or even aggregate numbers about editorial decisions.
  3. Metadata of content: Metadata provides additional information about the content under a takedown request. This can include information on the level of engagement, the dates of posting and removal, demographic information about the content author, geographic and demographic information detailing content engagement, etc. This data can be of special interest to researchers.

Stage 3. In correspondence with Phase 3: Transparency post-takedown decisions (Decision visibility)

  1. Public user: A public user is either a user who benefits from a large audience or engagement on the platform and creates viral posts on a regular basis, or a public figure such as a government official, a political figure, or anyone benefitting from a broad reach because of their achievements in the arts or in business in a certain country or region.
  2. Meaningfully public data: Meaningfully public data is data related to a public user, which can be subject to less privacy scrutiny in order to favor explainability and accountability for decisions affecting content that reaches a large audience.



[1] In one case, “the [requesting] individual’s site had been created and then back-dated for the purpose of filing this takedown request.” https://transparencyreport.google.com/copyright/overview?hl=en

[2] See, e.g., https://transparency.twitter.com/en/reports/removal-requests.html#2021-jan-jun.

[3] See https://transparencyreport.google.com/eu-privacy/overview?hl=en.

[4] See Department of Commerce DMCA Multistakeholder Forum: DMCA Notices and Takedown Processes, I(A)(7).

[5] See Council of Europe Recommendation on the impacts of digital technologies on freedom of expression, para. 4.5.

[6] See https://humanrights.berkeley.edu/sites/default/files/digital_lockers_report5.pdf and https://s3.amazonaws.com/kfai-documents/documents/02017b49c2/4.12.2021-Bowers-2.pdf.

[7] See https://www.eff.org/wp/osp

------------------------------------

[Return to Sprint Overview]