Advancing human rights and stakeholder engagement in gaming and extended reality

31-03-2023
Discussion at the Game Developers Conference on what companies can do to make their products and platforms safer for users and civil society.
[Image: panel of speakers sitting in front of the audience at the ECNL - Freedom Online Coalition welcome breakfast at the Game Developers Conference]

Who would have expected the 2023 Game Developers Conference (GDC) in San Francisco to become a place where diplomats and policymakers (in white sneakers!), civil society, gamers, developers, and other industry representatives gather over coffee and croissants?

On a rainy day in San Francisco, ECNL and the Freedom Online Coalition Silicon Valley Working Group (currently chaired by the Consulate General of Canada in San Francisco) co-organised a welcome breakfast to kick off GDC 2023 on March 21. The event featured prominent experts Lindsey Andersen from Business for Social Responsibility and Jenni Olson from GLAAD, an LGBTQIA+ rights organisation. Over 160 participants attended the two-hour workshop in person, which was opened by Rana Sarkar, Consul General of Canada in San Francisco, and moderated by Marlena Wisniak from ECNL. Participants from all corners of the world, and from a wide range of sectors and backgrounds, joined us for the discussion and showed deep interest in the topic. All the ingredients were there to stir up a vibrant conversation around online and offline harms and, importantly, to collectively discuss what companies can do to make their products and platforms safer for users and civil society.

The human rights impacts of gaming platforms largely reflect those of mainstream social media platforms, yet remain underexplored and unaddressed.

This conversation is extremely timely as the gaming industry, and the augmented reality/virtual reality (AR/VR) space more broadly, is growing rapidly. And yet, the impacts of gaming platforms and services on human rights and civic space - both positive and negative - are rarely explored. Indeed, “gaming platforms provide new opportunities for people to come together in community without restrictions of border or physical space,” Lindsey shared. “They also enable players, consumers and creators to create and express themselves, share information and access financial opportunities.” However, gaming platforms also mirror the harms that mainstream social media platforms cause to users and affected communities, and companies often remain unaware of these potential harms, let alone how to mitigate them. Lindsey warned that “as gaming expands to become more general community-gathering platforms, the spectrum of potential harm will grow as well.” Importantly, the potential for harm falls disproportionately on members of civil society and on historically and institutionally marginalised groups. Women and non-binary persons, LGBTQIA+ people, racialised persons, children and elderly people, people with disabilities, migrants and refugees, and those of lower socio-economic status are particularly at risk.

How can gaming platforms’ activities, services or products harm users and affected communities?

Lindsey shed light on bullying and harassment, particularly of women and the LGBTQIA+ community, as well as minors being exposed to inappropriate content and conduct. She also called out the risk of gaming platforms being used to incite violence against an ethnic or religious group in the context of a conflict, or to spread election-related misinformation. Platforms can also be abused for child sexual exploitation or for recruitment and radicalisation by terrorists and violent extremists.

Jenni asserted that "everyone deserves to feel safe — online and in real life. It sounds sort of naive to say this but obviously it's true. And there are so many things that companies can do to ensure this. Having strong hate speech policies and enforcement is really important. We are also disproportionately impacted by the suppression of legitimate LGBTQIA+ expression in online spaces, so companies need to be mindful and attentive to this problem as well.”

As with social media companies in general, all these harms occur across languages and geographic contexts, increasing the challenge of moderating online content adequately, fairly, and in a way that is consistent with human rights. This is particularly true given that content policies tend to be both under-enforced and over-enforced, leading to online harms and censorship, respectively. Privacy risks are front of mind, too: beyond protecting users’ data, issues related to bystander privacy, insider threats, and government requests for data also arise.

For extended reality (XR) – an umbrella term that includes AR, VR, mixed reality (MR) and (virtually!) everything in between – many of these benefits, risks, and their associated impacts can be amplified. Lindsey emphasised that “gaming platforms enable new forms of expression that offer more ‘lifelike’ forms of gathering, strong potential for new economic opportunities, and widening opportunities for people with disabilities.” The other side of the coin, however, is gloomier, with a strong spillover effect from online to offline harm. Indeed, Lindsey cautioned about harms related to more sensitive forms of data collection, especially as biometric data is collected and processed by headsets and haptic devices. The “lifelike” form of interaction also means that “online harassment and violence feels more ‘real’ and becomes harder to moderate, leading to physical safety and health risks.”

Gaming platforms must take a (transfeminist) human rights-based approach to their products, centering stakeholder engagement, transparency and marginalised groups.

Like other digital and emerging technologies, gaming platforms need to be designed, developed, and used in a manner consistent with international human rights standards. The United Nations Guiding Principles on Business and Human Rights (UNGPs) and emerging legislation in the EU, such as the Digital Services Act, the AI Act, and the Corporate Sustainability Due Diligence Directive, provide a baseline for corporate responsibility in tech. Human rights impact assessments (HRIAs) play an important role in preventing and addressing adverse impacts of technology.

Fortunately, Lindsey noted, “many companies with trust & safety teams (perhaps unknowingly?) already do some of this. They develop policies for what kind of behavior is allowed or not, build internal content moderation systems and user reporting systems, and have data protection policies and practices, among others. These are all things that address human rights risks.” A key shortcoming, according to Lindsey, is that “the assessment of risk is often missing. Companies tend to be reactive to issues as they arise or as people complain, but sometimes they miss the forest for the trees.” Jenni agreed, stating that “one of the best things companies can do is to proactively establish expectations for user behavior. To state their values strongly with regard to civil discourse. It's so important right now with the tidal wave of real world anti-LGBTQ and especially anti-trans rhetoric and violence and legislative attacks on LGBTQ rights."

Centering product development around marginalised and vulnerable groups improves safety, use, and fun (!) for all. To properly identify and address potential harms, human rights impact assessments must be conducted with meaningful engagement of affected communities, especially those from marginalised groups. Lindsey encouraged gaming platforms “to develop content and behavior moderation processes that match the purpose and ethos of each platform. Think creatively how to implement them, for example through community moderation and building pro-social communities.” Cross-industry and sectoral collaboration can be of tremendous help, with organisations such as the Trust & Safety Professional Association (TSPA) and the Digital Trust & Safety Partnership (DTSP) recently established to further advance safety online. Multistakeholder initiatives such as the Global Network Initiative (GNI) also provide invaluable support to platforms and can amplify civil society voices.

One common theme emerges from all these processes and organisations: the need for meaningful transparency and stakeholder engagement in the gaming industry. ECNL, together with SocietyInside, recently released a Framework for Meaningful Engagement in human rights impact assessments for AI. The framework is directly applicable to gaming platforms, especially those driven by and/or incorporating AI systems, such as recommender systems or targeted advertising. As gaming platforms move from a ‘niche’ audience to the broader public and multi-purpose use, it’s critical to center human rights, civic space, and marginalised groups in product design and use. The interest and enthusiasm that participants expressed at the event clearly show that there’s a need – and hopefully a will – to take this conversation forward.