Complying with DSA’s Transparency Reporting Obligations

Published on April 30, 2024
Last Updated on April 30, 2024

Wasbir Hazarika, Policy Lead, Trust & Safety Practice

Sarah Edwards, Manager, Wellness + Resiliency

In today's digital landscape, trust is currency. Users want to know how platforms handle their data and moderate content; that knowledge gives them a sense of control and safety. Transparency reporting helps digital platforms maintain accountability and build trust with their users.

The European Union's Digital Services Act (DSA) requires companies to comply with obligations on intermediary and hosting services that are designed to establish a safe digital environment, protect users' fundamental rights, and provide a level playing field for businesses. These obligations include the release of transparency reports.

What should be included in transparency reports?

These reports detail how a platform manages and moderates the content its users interact with every day. The DSA sets different annual reporting requirements depending on the type of content moderation service a business provides. Note that the Commission's detailed rules and templates for these reports had not yet been formally adopted at the time of writing.

Starting in February 2024, all providers of intermediary services, except micro and small enterprises, must publish annual reports about their content moderation practices. These reports, accessible to the public, shed light on how platforms handle various aspects of content management:

Responding to authorities: Report the number of orders received from Member States' authorities to act against illegal content, categorized by the type of illegal content concerned, along with the average time needed to act on them.

User-initiated actions: Disclose the number of notices received about illegal content, the actions taken in response, and average response times so users can understand how their reports are handled (see the illustrative sketch after this list).

Proactive moderation: Disclose content moderation undertaken at the platform's own initiative, including the number and type of measures taken that affect the visibility of information.

Complaint resolution: Report on complaints received through internal complaint-handling systems, including their basis, the decisions taken, average resolution times, and the number of decisions reversed, to build accountability and trust in complaint resolution.
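
To make these metrics concrete, here is a minimal sketch of how a platform might aggregate notice counts, actions taken, and average handling times by content category for such a report. The record format, category names, and figures are invented for illustration; they are not the DSA's official taxonomy or the Commission's template.

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hypothetical notice records; fields and categories are illustrative only.
notices = [
    {"category": "hate_speech", "received": datetime(2024, 1, 2), "actioned": datetime(2024, 1, 3), "action": "removal"},
    {"category": "hate_speech", "received": datetime(2024, 1, 5), "actioned": datetime(2024, 1, 5), "action": "no_action"},
    {"category": "counterfeit", "received": datetime(2024, 1, 7), "actioned": datetime(2024, 1, 9), "action": "removal"},
]

# Group notices by content category.
by_category = defaultdict(list)
for n in notices:
    by_category[n["category"]].append(n)

# For each category, report the notice count, removals, and average handling time.
for category, items in by_category.items():
    handling_hours = [(n["actioned"] - n["received"]).total_seconds() / 3600 for n in items]
    removals = sum(1 for n in items if n["action"] == "removal")
    print(f"{category}: {len(items)} notices, {removals} removals, "
          f"avg handling time {mean(handling_hours):.1f} h")
```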

In addition to the requirements for intermediary services, online platforms must include the following in their reports:

Dispute settlement: Disclose data on out-of-court dispute settlement procedures, including the number of disputes submitted, their outcomes, and the average time needed to resolve them, to enhance transparency around mechanisms used outside the courts.

Suspensions: Disclose the number of suspensions for different reasons, such as illegal content, unfounded notices, or complaints, to provide insights into the types of violations leading to platform intervention.

Automated moderation: Report the purposes for which automated content moderation tools are used, indicators of their accuracy, and the safeguards applied to mitigate potential errors and biases.

Online platforms must also disclose information about their average monthly active users in the European Union at least once every six months. This metric offers transparency into the platform's reach and helps determine whether it qualifies as a Very Large Online Platform (VLOP). Under the DSA, platforms with more than 45 million average monthly active users in the EU are designated as VLOPs.
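
As a simple illustration of that threshold calculation, the sketch below averages hypothetical monthly active user figures over a six-month window and checks them against the 45 million mark. The numbers are invented, and the actual designation decision is made by the European Commission.

```python
# Hypothetical EU monthly active user counts over a six-month window.
monthly_active_users_eu = [41_200_000, 43_500_000, 44_900_000,
                           46_100_000, 47_300_000, 48_000_000]

VLOP_THRESHOLD = 45_000_000  # DSA threshold for VLOP designation

# Average the monthly figures and compare against the threshold.
average_mau = sum(monthly_active_users_eu) / len(monthly_active_users_eu)
print(f"Average monthly active users in the EU: {average_mau:,.0f}")
print("Meets VLOP threshold" if average_mau >= VLOP_THRESHOLD else "Below VLOP threshold")
```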

VLOPs face the most stringent reporting requirements:

Conduct annual risk assessments: Analyze and report potential systemic risks stemming from their services, such as the spread of illegal content, impacts on fundamental rights, and intentional manipulation.

Implement and report mitigation measures: Based on identified risks, VLOPs must detail the mitigation measures, such as adapting moderation systems or collaborating with trusted flaggers. This transparency shows their commitment to addressing potential harms.

Undergo independent audits: VLOPs must be audited annually by independent organizations, evaluating their compliance with DSA obligations and commitments made in codes of conduct or crisis protocols.

How are DSA transparency reports structured?

While the DSA’s reporting obligations are already in force, the specific format and structure of the reports are still under development.

The European Commission has drafted two templates for reporting:

Quantitative Template:

Focuses on numerical data related to content moderation actions across different categories, such as the following (a simplified data sketch follows the list):

  • Orders from authorities to remove content (categorized by content type)
  • User notices about illegal content (categorized by content type)
  • Platform's own content moderation actions (categorized by type and reason)
  • Complaints about content moderation and their outcomes
  • For online platforms only:
    • Out-of-court dispute settlement cases
    • Suspensions for different reasons
    • Use of automated moderation tools
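
For a sense of how such figures might be organized internally before they are transferred into the Commission's template, here is a simplified, hypothetical data structure covering the categories above. The field names and numbers are illustrative only and do not reflect the official template schema.

```python
from dataclasses import dataclass, field

# A hypothetical, simplified container for the kinds of figures the quantitative
# template asks for; this is NOT the Commission's official schema.
@dataclass
class QuantitativeReport:
    # Orders from Member States' authorities, keyed by type of illegal content
    authority_orders: dict[str, int] = field(default_factory=dict)
    # User notices about illegal content, keyed by type of illegal content
    user_notices: dict[str, int] = field(default_factory=dict)
    # Own-initiative moderation actions, keyed by (action type, reason)
    own_initiative_actions: dict[tuple[str, str], int] = field(default_factory=dict)
    # Complaints about moderation decisions and how many decisions were reversed
    complaints_received: int = 0
    decisions_reversed: int = 0
    # Online-platform-only figures
    out_of_court_disputes: int = 0
    suspensions_by_reason: dict[str, int] = field(default_factory=dict)

# Example with invented figures.
report = QuantitativeReport(
    authority_orders={"illegal_hate_speech": 12, "counterfeit_goods": 4},
    user_notices={"illegal_hate_speech": 310, "counterfeit_goods": 95},
    own_initiative_actions={("removal", "policy_violation"): 1_840},
    complaints_received=220,
    decisions_reversed=35,
    out_of_court_disputes=3,
    suspensions_by_reason={"manifestly_illegal_content": 17},
)
print(report)
```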

Qualitative Template:

Requires a narrative explanation of:

  • The service provider's content moderation policies and procedures
  • The controls employed to protect fundamental rights and freedoms
  • Collaboration with trusted flaggers and other actors
  • Measures taken to address specific risks identified through risk assessments (for VLOPs only)

Transparency reports also allow stakeholders to scrutinize content practices and advocate for responsible moderation. That scrutiny helps mitigate risks such as biased algorithms or misuse of user data, builds stronger relationships with regulators, and protects businesses against costly fines.

TaskUs: Your compliance partner

TaskUs is dedicated to helping you adapt to the digital world's ever-changing requirements, including the critical intricacies of DSA transparency reporting. We’ll simplify the DSA's complex requirements and provide outsourced content moderation solutions to empower you to confidently achieve compliance.

By combining AI-powered solutions and our exceptionally skilled teams, we’ll help you:

  • Assess and understand your regulatory obligations at an operational level
  • Identify and assess the risks stemming from the design or functioning of your service and deliver risk assessments when required 
  • Review your existing content moderation processes, plan possible outsourcing solutions, identify gaps against your regulatory obligations, assess the criticality and effort required to close each gap, and lay out a mitigation plan that accounts for these factors and your available bandwidth
  • Scale and provide workforce across different countries and time zones
  • Provide an experienced team to test the implementation of pilot projects

With our expertise and innovation, compliance will become a strategic advantage for your business.
