Together with the Digital Services Act Civil Society Coordination Group, we are calling on online platforms to ensure meaningful transparency of the risk assessments they conduct annually under the Digital Services Act (DSA). In the statement, we specify the information platforms should publish so that external stakeholders, including civil society, researchers and journalists, can understand how platforms identified and mitigated risks.
Under the Digital Services Act, the largest online platforms have to assess and mitigate – on a yearly basis – systemic risks stemming from their services, including risks to fundamental rights. Each risk assessment must then be evaluated by an external independent auditor. No later than three months after receiving the audit report, platforms have to publish the results of the risk assessment, the measures taken to mitigate identified risks, the audit report itself, information on how the auditors’ recommendations were implemented, and whether they consulted civil society or other external stakeholders during the assessment process. If platforms and auditors have met the various deadlines imposed by the DSA, we should expect to see the documents related to the first round of risk assessments (conducted in 2023) by 25 November 2024.
Why are meaningful risk assessments important?
Risk assessments, and the actions taken to mitigate the risks they identify, are a crucial preventative safeguard in the DSA which – if implemented meaningfully – can help protect social media users and wider society from censorship, privacy violations and threats to their right to use social media to exercise their freedoms of assembly and association.
In 2023, together with Access Now, we published a report detailing our expectations of online platforms regarding the questions a risk assessment should address, to ensure that assessments are not merely a box-ticking compliance exercise but play a key role in helping platforms meaningfully and actively protect human rights. In that report, we also set out key expectations for the transparency of risk assessment findings.
Building on our 2023 recommendations, we collaborated with partners in the DSA Civil Society Coordination Group to clarify and further develop concrete asks for platforms ahead of the expected publication date. The information we expect platforms to publish includes, for example:
- A detailed methodology of the risk assessment, including the platforms’ understanding of “systemic risks”;
- A detailed description of the assessed service, including any algorithmic or advertising systems, which will help external stakeholders understand how platforms function and what risks can stem from the design of their various functionalities;
- The identified risks and to which functionalities or parts of the platforms they are linked;
- Mitigation measures already taken or to be taken as a result of the risk assessment;
- Which internal teams, departments and senior management were involved in the risk assessment;
- Which external stakeholders platforms consulted in the process;
- How the audit informed the measures adopted by platforms to mitigate risks.
With the publication of these documents expected towards the end of November, we will be able to verify whether platforms took the risk assessment process seriously and whether the information they share allows civil society to hold them to account. Based on this review, we will engage with the European Commission and with the platforms themselves to discuss gaps in the risk assessments and how the process should be improved in the future.