
Here’s how the Digital Services Act changes content moderation

Platform governance has been at the forefront of the tech and media world recently, given the rise in cyberbullying, hate speech and other harms online

By: EBR - Posted: Thursday, December 15, 2022

by Akash Pugalia and Farah Lalani*

Platform governance has been at the forefront of the tech and media world recently, given the rise in cyberbullying, hate speech and other harms online. With the UK recently dropping the ‘legal but harmful’ clause from its Online Safety Bill, many are wondering how new regulation across the globe will evolve to shape digital safety. The Digital Services Act (DSA) is of particular interest.

With the ethos that ‘what is illegal offline should be illegal online’, the DSA aims to provide clearer and more standardized rules for large and small digital service providers across the European market. In particular, the DSA will regulate how platforms moderate content, advertise, and use algorithms in recommendation systems. This will apply not just to large platforms but also to small businesses, across online marketplaces, social networks, app stores, travel and accommodation platforms, and many others.

Preparing for Digital Services Act enforcement

Currently, there are more than 10,000 platforms in the EU, 90% of which are small and medium-sized enterprises. But with 27 different sets of national rules, the cost of compliance could be prohibitive for small businesses. The DSA aims to ensure that small online platforms are not disproportionately affected while still remaining accountable. So what should companies consider in preparing for the enforcement of the Digital Services Act? Below are some areas to keep in mind:

Content Removal

The Digital Services Act harmonizes the process by which platforms are notified of illegal content and must take subsequent action. More concretely, once notified by trusted flaggers, platforms will have to remove illegal content ‘expeditiously’. The DSA also stipulates that users are informed about, and can contest, the removal of their content by platforms, with access to dispute resolution mechanisms in their own country.

While the Digital Services Act doesn’t have specific timelines for content removal, companies need to be prepared for quick removal and have the right processes and capacity in place in order to act on notifications from trusted flaggers. In addition, if platforms are not currently providing explanations to users about their removal decisions, this process will need to be instituted across the board.

Specific pieces of information are required in this explanation: whether the action involves removal or demonetization; whether it was taken in response to a submitted notice or a voluntary own-initiative investigation; whether automation was used in the decision; a reference to the legal ground for illegal content, or to the community guideline breached for a policy violation; and the redress mechanisms available to the user.
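
For teams building these explanations into their tooling, the required information could be captured as a simple structured record. The sketch below is illustrative only; the class and field names are assumptions, not terms taken from the regulation.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ActionType(Enum):
    REMOVAL = "removal"
    DEMONETIZATION = "demonetization"
    VISIBILITY_RESTRICTION = "visibility_restriction"

@dataclass
class StatementOfReasons:
    """Illustrative record of the information a platform could include
    when explaining a moderation decision to the affected user."""
    content_id: str
    action: ActionType
    triggered_by_notice: bool      # submitted notice vs. own-initiative investigation
    automated_decision: bool       # whether automation was used in the decision
    legal_ground: Optional[str]    # reference to the legal basis, if content is illegal
    policy_ground: Optional[str]   # community guideline breached, for policy violations
    redress_options: list[str]     # e.g. internal appeal, out-of-court dispute body

    def validate(self) -> None:
        # At least one ground (legal or policy) and the available redress must be stated.
        if not (self.legal_ground or self.policy_ground):
            raise ValueError("A legal or policy ground must be given")
        if not self.redress_options:
            raise ValueError("Available redress mechanisms must be listed")
```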

Proactivity

The Digital Services Act makes it clear that platforms and other intermediaries are not liable for users’ unlawful behaviour unless they are aware of illegal acts and fail to remove them. This shield from legal liability aims to encourage companies to be more proactive when moderating the content on their platforms. Only if the flagged content is manifestly illegal can such notices give rise to ‘actual knowledge’. According to the text of the Digital Services Act (recital 63), “Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded.”

This helps address a potential disincentive to voluntary measures by small platforms: the Digital Services Act reinforces that diligent platforms are not liable for illegal content they detect themselves. Small platforms can therefore invest further in robust content moderation practices, which can act as a competitive differentiator in the market, without worrying about potential legal implications.

Notice and Action

The Digital Services Act highlights the need for a notice and action mechanism, covering the creation of complaints systems and how notices are actioned. Notices flagging purportedly illegal content must be sufficiently precise (recital 22); in practice, this means providers should be able to determine illegality without needing a substantive analysis or legal review.

When actioning these notices, platforms need to ensure that responses are targeted so that fundamental rights, including free expression and others outlined in the EU Charter of Fundamental Rights, are upheld. The Digital Services Act also foresees measures against misuse, allowing online platforms to suspend individuals who misuse the notice and action mechanism and/or complaint handling systems, as long as each decision to suspend is assessed on a case-by-case basis.
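
As a rough, hypothetical sketch of how a notice-and-action pipeline might reflect these points, priority handling for trusted flaggers and case-by-case escalation of repeated unfounded notices could look something like the following; the class names and the threshold are invented for illustration and are not taken from the regulation.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    notice_id: str
    submitter_id: str
    content_url: str
    explanation: str           # why the submitter considers the content illegal
    trusted_flagger: bool = False

class NoticeQueue:
    """Hypothetical intake queue: trusted-flagger notices are handled first,
    and repeat submitters of manifestly unfounded notices are escalated for a
    human, case-by-case review before any suspension decision."""

    UNFOUNDED_REVIEW_THRESHOLD = 5  # illustrative threshold, not from the DSA

    def __init__(self) -> None:
        self._priority: list[Notice] = []
        self._standard: list[Notice] = []
        self._unfounded_counts: defaultdict[str, int] = defaultdict(int)

    def submit(self, notice: Notice) -> None:
        # Notices must be precise enough to assess without a deep legal review.
        if not notice.explanation.strip():
            raise ValueError("Notice must explain why the content is considered illegal")
        (self._priority if notice.trusted_flagger else self._standard).append(notice)

    def next_notice(self) -> Optional[Notice]:
        # Trusted-flagger notices are actioned ahead of the standard queue.
        if self._priority:
            return self._priority.pop(0)
        return self._standard.pop(0) if self._standard else None

    def record_unfounded(self, submitter_id: str) -> bool:
        """Return True when a submitter should be reviewed (case by case) for
        possible misuse of the notice mechanism."""
        self._unfounded_counts[submitter_id] += 1
        return self._unfounded_counts[submitter_id] >= self.UNFOUNDED_REVIEW_THRESHOLD
```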

Transparency and due diligence

Increased transparency is a theme that runs throughout the Digital Services Act, whether around how to report illegal content, the explanations given for content removal, platforms’ terms and conditions, the role of algorithms in recommending content, and more.

Companies will need to be comprehensive, systematic, and diligent in their transparency efforts related to content moderation and the decisions surrounding it. Providers of hosting services are required to notify law enforcement of any suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place or is likely to take place. From an e-commerce perspective, online marketplaces can be required to trace their traders under the ‘know your business customer’ principle, helping to track down sellers of illegal goods, and dark patterns that manipulate users are prohibited. From an advertising perspective, greater transparency on ad targeting (who sponsored the ad, how and why it targets a user) is required, as well as a ban on certain types of targeted advertising. Moreover, clear information on why content is recommended to users will need to be provided.
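
On the advertising side, the transparency details shown alongside an ad could be modelled as a small record like the one below. The field names are assumptions made for illustration, not definitions from the regulation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AdTransparencyInfo:
    """Illustrative ad-transparency metadata a platform might surface next to an ad:
    who is behind it and the main parameters used to target the viewer."""
    ad_id: str
    sponsor: str                          # on whose behalf the ad is presented
    payer: str                            # who paid for the ad, if different
    targeting_parameters: dict[str, str]  # e.g. {"location": "DE", "language": "de"}

    def user_facing_text(self) -> str:
        # Render a short, user-readable explanation of the ad's provenance and targeting.
        params = ", ".join(f"{k}: {v}" for k, v in self.targeting_parameters.items())
        return (f"Sponsored by {self.sponsor} (paid for by {self.payer}). "
                f"Shown to you based on: {params or 'no specific targeting'}.")
```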

These are just some of the areas companies need to be aware of when thinking about their trust and safety strategy ahead of the Digital Services Act. The European Commission has published a summary of all the new obligations and who they apply to.

How will the DSA be enforced and what does it mean for digital service providers?

As an EU Regulation, the Digital Services Act is directly applicable in every member state; enforcement will be split between national regulators and the European Commission. As Commissioner Thierry Breton outlined in his sneak peek at enforcement of the new legislation, the Directorate-General for Communications Networks, Content and Technology will play a key role, overseeing due diligence obligations including risk assessments, independent audits, and data access, among other areas.

Each EU member state will assign a national authority to the role of Digital Services Coordinator (DSC), responsible for all matters related to supervision and enforcement of the Digital Services Act at the national level. Many countries are tasking existing audio-visual media regulators with fulfilling the role of DSC, but are also considering assigning specific tasks to electronic communications regulators, consumer protection authorities, or other relevant bodies.

Given that the rules are expected to be applied seamlessly and uniformly throughout Europe, companies should be taking steps now to understand the requirements in detail and shore up the trust and safety capabilities required to comply. But compliance with the Digital Services Act will serve as the new minimum safety baseline; companies will need to think even more proactively about their platform policies, enforcement, use of automation versus people, and closed-loop feedback mechanisms to get ahead of new risks to come.

*Global President, Trust & Safety, Teleperformance, and Global Vice President, Trust & Safety, Teleperformance, respectively
**first published in: Weforum.org
