
On 26 June this year, the Brazilian Supreme Court concluded a ruling on the country's Civil Rights Framework for the Internet (MCI, the acronym in Portuguese), pioneering legislation enacted 11 years ago to define principles, rights and duties for internet use in the country. More specifically, the court was assessing the constitutionality of Article 19, which deals with platforms' liability for damage resulting from content posted by users.
APC interviewed Flávia Lefèvre and Oona Castro from Instituto Nupef, one of our founders and members in Brazil, to understand the pros and cons of the recent ruling, the pressures emerging from the rise of the far right in the country and around the world, and possible paths to safeguard digital rights and hold big tech accountable, considering the solid history of resistance by Brazilian civil society organisations.
What would Nupef highlight as positive and/or negative in the changes wrought by the recent ruling of the Brazilian Supreme Court?
In Brazil, the Supreme Court is the body with legal authority to rule on constitutional rights and has the final say on the constitutionality of provisions of ordinary laws. Article 19 of the MCI establishes that internet application providers cannot be held liable for content produced by users or third parties unless they fail to comply with a specific court order determining the removal of that content.
The objectives of Article 19 are, as stated in its wording, to prevent prior censorship and guarantee freedom of expression. It is the materialisation of one of the internationally accepted principles of internet governance – that of the non-imputability of the network.
The logic behind Article 19 was that, in order to prevent platforms from arbitrarily taking down content based on criteria linked to their commercial and profit interests, they should not be held liable for content published by their users. So, the legislator wanted to ensure that the legality of a given piece of content would be analysed by the judiciary, without the contamination of private interests. It was understood back then that holding platforms liable had the potential to limit political debate and freedom of expression in what was then the “new” public arena for discussion.
So civil society argued that content applications should not be held liable for what users post, except in cases of obvious crimes, believing that this would promote an environment with greater freedom of expression, in opposition to initiatives then emerging in the National Congress to penalise and criminalise various uses of the internet in the 2000s.
However, in recent years, freedom of expression has become mixed with misinformation, and platforms have done little, if anything, to mitigate the proliferation of misinformation online. On the contrary, their algorithms often encourage this type of content. In view of this, part of the progressive sector and civil society began to argue that platforms should have the power to remove content without a court order.
The recent court discussion regarding platform liability, however, ignored other provisions expressed in the MCI which hold platforms liable for their own actions under Brazilian law.
The development of content moderation algorithms calibrated to expand the reach of content that violates the law, the paid promotion of content that encourages illegal discrimination, hate speech and other illegal acts, and even the recommendation of illegal content are acts of the platforms themselves, and acts that bring them profit. Under existing laws, therefore, these companies must be held liable, together with the author of the content, for damage caused by their commercial practices.
In other words, the liability regime of the Brazilian Civil Rights Framework for the Internet distinguishes between liability for acts of third parties (Article 19) and liability for the platforms' own acts (Article 3). However, the Supreme Court ignored this distinction and, in its ruling on the constitutionality of Article 19, determined the partial constitutionality of the provision, stating that it remains in effect for crimes against honour – slander, libel and defamation.
In addition, it expanded the list of cases in which the platform must exercise what it called a duty of care, inspired by European Community legislation, requiring it to remove content either when prompted by extrajudicial notification or when, in its judgment, illegal content is present.
While it is clear that it is necessary to regulate the operation of platforms and commit them to combating misinformation, Nupef has concerns about the power given to companies that own content applications to define what can and cannot be removed/published.
What paths exist to promote platform accountability while minimising the risks of censorship and violations of true freedom of expression? And how can we build these paths by dispelling the myths created to misuse the idea of freedom of expression in order to prevent any form of regulation and accountability?
Brazilian legislation already holds platforms accountable, as is the case with our Consumer Code, Electoral Law, and Statute of Children and Adolescents, which could and should have been enforced long ago, as we explained in the previous answer.
The regulation of platforms, regardless of the discussion on liability, should focus on their algorithmic practices, transparency, mandatory periodic reporting on the criteria used for content removal and account suspension and their results, as well as the availability of applications that allow regulators and researchers access to data related to these practices.
In addition, the public authorities responsible for consumer protection, competition and other areas would need to actually enforce the laws that already exist and are not respected by platforms. For example, there are thousands of cases of illegal content circulating on social networks, affecting children and adolescents in particular, especially now with the advancement of Artificial Intelligence.
The same can be said of ideological and political debates marked by misinformation, which have already caused concrete damage to the country's democratic institutions, such as the National Congress and the Federal Supreme Court itself. We saw this with the attempted coup d'état, which aimed to prevent the president elected in 2022, Luiz Inácio Lula da Silva, from taking office and which culminated in the attacks in Brasília on 8 January 2023.
What are the next steps? Why is it important to have a regulatory law in the National Congress in addition to the court ruling?
The current configuration of Brazil's Legislative Branch, with its preponderance of right-wing and far-right parliamentarians, makes it a challenge to pass a law along the lines of a bill proposed in 2020 (PL 2630/2020), which contained a widely debated proposal but was shelved due to, among other reasons, lobbying by the platforms, welcomed by the then president of the Chamber of Deputies.
After the election of Donald Trump in the US, the challenge is even greater, as he is acting in defence of the interests of the platforms in order to avoid regulation of their activities, and is imposing sanctions on Brazil with express mention of the issue of Big Tech, which supported and financed his election.
What aspects of the Brazilian process could serve as inspiration for other countries or in the international debate on platform accountability?
I believe that the lesson we can learn from these recent processes involving the legislative and judicial branches is that the struggle for transparency, accountability, and respect for freedom of expression is deeply related to the political environment.
This points to the need to organise ourselves to confront neoliberal ideals and the far-right movement that sustains them through social media, more than to rely on regulation itself, which, due to technological complexity and the high concentration of digital services in the hands of US companies, is deeply compromised in the current context, not only in Brazil but also worldwide.