Safe, Secure, and Trustworthy AI: ECNL at the White House

23-11-2023
At the White House National Security Council Roundtable, we encouraged policymakers to focus on the real-world harms of AI.

U.S. President Biden issued an Executive Order (EO) on Safe, Secure, and Trustworthy Artificial Intelligence on October 30, 2023. The EO aims to "establish new standards for AI safety and security, protect Americans’ privacy, advance equity and civil rights, stand up for consumers and workers, promote innovation and competition, advance American leadership around the world, and more."  

Overall, we welcome the EO, on the condition that its implementation effectively centres people's rights, especially those of marginalised groups. We hope the EO will be operationalised through rigorous standards developed by the National Institute of Standards and Technology and the U.S. AI Safety Institute, in consultation with civil society. Going forward, we hope the U.S. government will advance policy measures that address real-world harm by protecting and promoting human rights, in a way that does not play into techno-solutionist or alarmist narratives related to so-called 'existential risk.'

On November 14, 2023, ECNL's Marlena Wisniak was invited to participate in a roundtable held by the White House National Security Council. Following the in-person consultation, we shared a written input that includes initial reactions to the EO as well as the following key recommendations: 

  • Develop international AI legal instruments consistent with international human rights law.
  • Advance legally binding instruments that cover the obligations of both the private and public sectors.
  • Focus on real-world harm and adopt a sectoral approach to AI governance.
  • Establish heightened obligations for the use of AI systems by law enforcement, including prohibitions.
  • Push back against blanket exemptions for national security.
  • Ensure meaningful civil society participation in AI governance at the national and global levels.