5 minute read | September 13, 2024
This update is part of our EU AI Act Series. Learn more about the EU AI Act here.
The EU AI Act has been formally adopted and the countdown to the start of enforcement has begun.
Here are six steps you can take to jump start your compliance efforts:
Addressing the AI Act’s obligations requires a combination of legal, technical and operational expertise.
Organizations that are most successful in addressing AI obligations and risks typically designate a diverse, cross-discipline group responsible for the company’s AI governance and compliance. Team members are often pulled from legal, compliance, information technology, engineering and product departments. This core team is then supported by AI champions within departments across the organization who act as the first point of contact for new policies, processes and tools.
To successfully navigate the evolving AI legal environment, organizations should develop and implement internal policies and procedures to align their use and development of AI with their mission, risk profile, legal obligations and ethical priorities.
While the AI Act may dictate some of the components of this AI governance framework, the most successful companies will build a foundational program upon which specific compliance obligations can attach. This approach allows for greater flexibility in adapting to new laws and technologies and helps to integrate AI governance into related compliance and operational frameworks, such as those related to data protection and cybersecurity.
Most organizations start by developing an AI Responsible Use Policy to guide their initial AI-related compliance efforts.
Starting February 2, 2025, the AI Act will require covered organizations to ensure that their employees and other persons dealing with AI on their behalf have a sufficient level of AI knowledge and competence to support the appropriate development and use of AI and compliance with applicable law.
Organizations should consider implementing standard, organization-wide training that gives all employees and relevant workers a baseline knowledge of the technology and related compliance requirements. More specific, role-based training should be provided where an individual has greater involvement with AI development or use.
The EU AI Act regulates only certain covered AI-related technologies.
As a first step in assessing the Act’s applicability and corresponding obligations, organizations should inventory the AI they are developing, using, distributing or selling and document:
Organizations should consider adopting a department-by-department approach to the inventory process, as this will often help identify less visible AI use cases.
After the organization’s AI inventory is completed, it can be used to help determine whether the organization’s AI qualifies as an in-scope technology and, if it does:
Given that the prohibitions on certain AI practices (“Prohibited AI”) take effect on February 2, 2025, we recommend that companies prioritize determining whether any of these prohibitions may apply. The other obligations and restrictions under the AI Act have later effective dates (see here for more details).
As shown in our compliance timeline, the effective dates for the AI Act’s obligations are spread out over several years. Nonetheless, the AI Act is already influencing legislation in other countries (see our U.S. AI Law Tracker), with some laws jumping ahead of the AI Act in terms of implementation. In addition, compliance measures for many of the Act’s obligations and restrictions can require meaningful lead time prior to implementation.
As a result, organizations should consider:
Check back in January for next steps to implement in 2025.
Want to know more? Reach out to a member of our team.