
29 March 2023 – 7 minute read

DSA: A New Era for Online Dispute Resolution – with an Irish twist

There are many perspectives on the challenges of dealing with illegal content online. Complainants have had to weigh up whether reporting a complaint is even worth the effort, given the time, energy and cost involved without any guarantee of a satisfactory or even timely outcome. The only alternative has been recourse to a national court, which raises practical concerns of cost and effective access. Online service providers, on the other hand, have had to deal with complaints that often do not clearly identify the content at issue and are vague as to why that content is unlawful, yet which, if no action is taken, create a real risk of significant liability.

We now stand on the cusp of a new era that tries to address these issues and that will change how online service providers manage liability for user content.

By way of context, the EU Digital Services Act (DSA) will soon regulate a vast range of online service providers, including intermediary services, hosting services and online platforms. One of its main objectives is to protect online users by stopping the spread of illegal content online and providing users with a new range of complaint mechanisms. Most of the new rules will apply from 17 February 2024 – and platforms are expected to need to make significant IT and procedural changes.

In this update, we discuss one specific aspect of the DSA: the new internal complaint-handling system and out-of-court dispute resolution mechanism introduced to deal with disputes between online platforms and their users. We also look at parallel developments in Ireland, which recently adopted the Online Safety and Media Regulation Act (OSMR) and established a new Media Commission.


Internal complaint-handling

The new dispute resolution rules apply to businesses such as online marketplaces, social media platforms and content-sharing platforms. These platforms will need to provide users with access to an effective internal complaint-handling system that enables them to lodge complaints, electronically and free of charge, against content-related decisions. For example, such disputes may involve a platform's decision to remove content posted by users, a decision to suspend service to a user, or a decision to suspend or terminate a user's account. Online platforms will need to adapt existing internal complaint mechanisms or, more likely, build new ones that comply with the DSA.

The DSA has set particular standards for these new complaint-handling systems. For example, they must be easy to access, user-friendly and must facilitate the submission of precise and substantiated complaints. They must also operate in a timely, non-discriminatory, diligent and non-arbitrary manner.
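To illustrate what these requirements might mean for a platform's systems, here is a minimal TypeScript sketch of a complaint record. Every type and field name here is our own assumption for illustration only; the DSA prescribes outcomes, not schemas.

```typescript
// Minimal illustrative sketch of a DSA-style complaint record.
// All names are assumptions; the DSA prescribes outcomes, not schemas.

// The kinds of content-related decisions a user may complain about.
type ContestedDecisionType =
  | "content_removal"       // content removed or access disabled
  | "service_suspension"    // provision of the service suspended
  | "account_suspension"    // account suspended
  | "account_termination";  // account terminated

interface Complaint {
  id: string;
  userId: string;
  contestedDecision: {
    type: ContestedDecisionType;
    contentId?: string;     // precise identification of the content at issue
    decidedAt: string;      // ISO 8601 timestamp of the platform's decision
  };
  grounds: string;          // substantiated reasons supporting the complaint
  submittedAt: string;      // lodged electronically
  feeCharged: 0;            // complaints must be free of charge
}
```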

Once a complaint has been reviewed by an online platform and a decision made, the complainant must be informed promptly of the content-related decision. The decision must be reasoned and must be taken by appropriately qualified staff, not solely on the basis of automated means. Online platforms must also inform users of the possibility of referring the dispute to an out-of-court dispute settlement body and of other available avenues for redress.
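Continuing the sketch, the decision notified back to the complainant might carry fields along these lines. Again, the shape is our assumption; the obligations it encodes (a reasoned decision, human involvement, and signposting of redress) come from the DSA.

```typescript
// Illustrative shape of the decision returned to the complainant.
// Field names are assumptions; the obligations come from the DSA.

interface ComplaintOutcome {
  complaintId: string;
  outcome: "upheld" | "reversed";
  statementOfReasons: string;      // the decision must be reasoned
  reviewedByQualifiedStaff: true;  // not taken solely by automated means
  redressOptions: {
    outOfCourtBodies: string[];    // certified OOC bodies the user may approach
    otherRemedies: string[];       // e.g. judicial redress
  };
  notifiedAt: string;              // the complainant must be informed promptly
}
```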


Out-of-Court Dispute Settlement

For complaints that are not resolved by the internal complaint-handling system, users may choose to avail of a new dispute settlement process. The DSA provides for the creation, certification and recognition of new out-of-court dispute settlement bodies (OOC bodies) through which users can attempt to resolve disputes over content-related decisions without going to court. While these OOC bodies will not be able to impose a binding solution, the DSA requires both user and platform to take part in the dispute settlement process.

OOC bodies will be certified by the Digital Services Coordinator of the EU Member State in which the OOC body is established. In Ireland, the Media Commission has been designated as the Digital Services Coordinator, and an individual is to be appointed to the role. OOC bodies will need to satisfy certain conditions in order to be certified. For example, they will need to demonstrate that they are impartial and financially independent of online platforms, and that they have the necessary expertise in respect of illegal content. This is an interesting feature of the DSA, as the Digital Services Coordinator in any given Member State will ultimately be responsible for determining how to apply the certification criteria in that Member State. The requirements for certification as an OOC body may therefore differ between jurisdictions, and these differences will only become apparent once local Digital Services Coordinators begin carrying out their functions.

In practice, it is expected that users will ask OOC bodies to review disputes over content moderation decisions. For example, if an online platform suspends a user's account or removes a specific piece of content, that user can ask one of the OOC bodies to initiate a dispute settlement process concerning the platform's decision. In theory, the OOC bodies will resolve disputes in a swift, efficient and cost-effective manner. However, it remains to be seen how effective and efficient the process will be, as there is no prescribed procedure for OOC bodies to follow and no power to impose binding decisions on the parties.


What we think is new – and why Ireland is different

At one level, the EU-wide DSA and the OSMR in Ireland pursue similar policy goals. Both give users important legal rights to challenge online content moderation decisions, and many platforms will fall within the scope of both regimes. Indeed, many platforms have already established appeal mechanisms allowing users a second review of content moderation decisions. Some online platforms have even set up independent bodies to issue decisions in high-profile content moderation cases. Facebook's Oversight Board, for example, has the power to uphold or reverse Facebook's content decisions, and its decisions are binding on Facebook. The distinction between some of these independent bodies and the new OOC bodies is that users cannot currently require a review of their particular case, as the existing bodies only deal with a selection of representative cases.

However, the DSA and the OSMR also diverge on central aspects of content moderation and complaint handling. While the DSA regulates ‘illegal content’, the OSMR also brings ‘harmful content’ within its scope. Given the parallel complaints mechanisms, we anticipate potentially complex jurisdictional and procedural questions as to how the two dispute resolution regimes interact, and as to the precise role of the OSMR’s online safety codes.

In particular, the OSMR gives the Media Commission significant powers to make binding online safety codes dealing with a range of matters, including standards for content moderation, assessments by service providers of the availability of harmful online content, and the handling of user complaints. The Media Commission will need to ensure alignment between the complaints-handling standards set out in online safety codes and the corresponding requirements for internal complaint-handling systems of providers of online platforms under the DSA. Indeed, the Media Commission has noted that it will prioritise codes governing online content relating to children, presumably because this was a significant political driver for introducing these measures in Ireland.

Overall, we expect that OOC bodies may well provide a highly favourable and cost-efficient alternative for users who wish to have content-related decisions overturned. Notably, if an OOC body decides in favour of a user, the online platform must bear the costs and reimburse the user for reasonable expenses. Conversely, if an OOC body decides in favour of the online platform, the user will not be required to reimburse any fees or expenses unless the user manifestly acted in bad faith. Given the nominal fees and a risk allocation skewed in favour of users, online platforms may well be concerned about exposure to a significant volume of referrals to OOC bodies.
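To make the cost asymmetry concrete, the allocation rule described above can be sketched as a simple function. This is illustrative only; the fee figures and the bad-faith assessment are matters for the OOC body and the DSA, not this sketch.

```typescript
// Sketch of the DSA cost-allocation rule for OOC dispute settlement.
// Inputs are hypothetical; only the allocation logic mirrors the rule
// described in the text above.

interface DisputeResult {
  decidedForUser: boolean;
  userManifestlyActedInBadFaith: boolean;
}

// Returns the amount of fees and expenses the user must bear.
function userShareOfCosts(result: DisputeResult, feesAndExpenses: number): number {
  if (result.decidedForUser) {
    // The platform bears the costs and reimburses the user's
    // reasonable expenses.
    return 0;
  }
  // Even if the user loses, they owe nothing unless they
  // manifestly acted in bad faith.
  return result.userManifestlyActedInBadFaith ? feesAndExpenses : 0;
}

// e.g. a losing, good-faith user still pays nothing:
// userShareOfCosts({ decidedForUser: false, userManifestlyActedInBadFaith: false }, 50) === 0
```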
