The EU AI Act: 10 Things Startups Should Know


5 minute read | October 16, 2024

This update is part of our EU AI Act Series. Learn more about the EU AI Act here.

Like other companies, startups that develop, use or commercialize AI systems or general-purpose AI models (GPAIMs) used by people in the EU must comply with the AI Act. Here are 10 things startups should know.

1. Do startups have to comply with the AI Act?

Yes, they must comply if they develop, use or commercialize an AI system or a GPAIM in the EU. This includes selling products in the EU and using internal systems for EU-based operations. The compliance obligations vary depending on a company’s role and the level of risk associated with a given AI system or GPAIM.

2. What action should startups take now?

Companies of all sizes should map the AI systems and models they develop and use to determine if and how the AI Act will affect them. They should prioritize compliance based on risk and document compliance activities. Startups may want to consider engaging counsel with expertise in the AI Act to ensure they understand and meet their obligations.

3. What about startups based outside the EU?

If a company outside the EU places an AI system or a GPAIM on the market in the EU, the Act can apply. The same is true if the output of an AI system is used in the EU. There are limited exceptions, so startups should consult with counsel to determine whether any apply to them.

4. What are the deadlines to comply?

The deadline to comply depends on a variety of factors, including when a company developed an AI system or a GPAIM and the level of risk the AI Act assigns to a given system or GPAIM.

Many compliance obligations take effect on August 2, 2026.

Check out this overview of key dates for operators of AI systems and GPAIM providers.

5. What compliance costs will startups incur?

The compliance costs startups face will depend on the nature of the AI system or GPAIM they develop. Businesses that develop GPAIMs and AI systems should factor in compliance costs, particularly for AI systems the Act considers high risk, such as those used for emotion recognition, employee recruitment or selection, and critical infrastructure components.

Examples of potential compliance costs may include:

  • Establishing a quality management system that complies with the Act.
  • Implementing human oversight.
  • Preparing technical documentation.
  • Hiring an independent auditor to audit high-risk AI systems.
  • Ensuring adequate cybersecurity.

6. Do startups face the same obligations as more established companies?

The European Commission has tried to ease the compliance burden on startups and other small and medium-sized enterprises.

For example, the AI Act requires companies that provide high-risk AI systems to set up a compliant quality management system. This may be one of the more costly compliance obligations. The Act seeks to ease the burden on startups and other small and mid-sized businesses by:

  • Tailoring quality management system requirements to a company’s size.
  • Letting startups meet the obligations in a simplified manner.
  • Calling on the European Commission to clarify what elements of a quality management system startups must implement.

7. Does the AI Act make any other allowances for startups?

  • Startups must also prepare technical documentation for AI systems the AI Act deems high risk. The Act allows small and medium-sized enterprises, including startups, to provide this documentation in a simplified form.
  • In some cases, startups that provide high-risk AI systems must pay an independent third party (a notified body) to assess their systems, including the quality management system and technical documentation.
    • Third-party reviewers must accept the simplified form of technical documentation when reviewing a company’s compliance.
    • Fees for the assessments will be reduced proportionately based on a company’s size.
  • In January 2024, the European Commission launched a program to help startups and small and mid-sized enterprises develop AI systems that align with EU requirements.

8. What developments should startups approach with caution?

The AI Act prohibits placing AI systems that pose unacceptable risks on the market in the EU, such as those that involve:

  • Social scoring that leads to detrimental or unfavorable treatment.
  • Manipulative or deceptive techniques that impair decision-making and cause or are likely to cause harm.
  • The exploitation of vulnerable groups, for instance due to age or disability.
  • Creating or expanding facial recognition databases through the untargeted scraping of facial images from the internet or CCTV footage.
  • Predicting the risk that a person will commit a criminal offense based solely on profiling or assessment of personality traits.
  • Inferring emotions in workplaces or educational institutions.
  • Real-time remote biometric identification in publicly accessible spaces for law enforcement purposes, unless for an exempted purpose.
  • Biometric categorization systems for the purpose of deducing race, political opinions, sexual orientation or other sensitive information.

The AI Act goes into greater detail about what is prohibited, including information about exemptions.

9. What else should startups know?

  • The European Commission is developing codes of practice related to a number of obligations under the AI Act, including obligations related to GPAIMs. Once published, the codes will provide much-needed clarity for small and medium-sized companies on how to comply with the AI Act.
  • The European Commission has encouraged startups and others to help draft these codes. The Commission’s new AI Office plans to publish updates.
  • Startups that use AI systems in their operations must train staff on AI and be transparent about the use of generative AI technologies. For low-risk uses, startups may consider using AI tools and off-the-shelf compliance solutions to meet some of these requirements.

10. What if a startup does not comply with the AI Act?

  • Fines for noncompliance with prohibited AI practices may be up to 35 million euros or 7% of global annual revenue, whichever is higher.
  • Noncompliance regarding high-risk AI systems and violations of transparency obligations may result in fines of up to 15 million euros or 3% of global annual revenue, whichever is higher.
  • For small and medium-sized enterprises, including startups, each of these fines is instead capped at the lower of the fixed euro amount and the percentage of global annual revenue (see the illustrative calculation below).
  • Additionally, GPAIM developers may be subject to fines of up to 3% of global annual revenue or 15 million euros, whichever is higher.
  • The AI Act says fines should be proportionate and should take into account a provider’s size, in particular whether the provider is a small or medium-sized enterprise, including a startup.
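To make the SME cap concrete, here is a minimal sketch of the arithmetic, using hypothetical revenue figures. The `applicable_cap` helper is invented for illustration only and is not legal advice; the Act’s actual calculation turns on facts such as worldwide annual turnover for the preceding financial year.

```python
def applicable_cap(revenue_eur: float, fixed_cap_eur: float,
                   pct_cap: float, is_sme: bool) -> float:
    """Illustrative ceiling on an AI Act fine (hypothetical helper).

    The Act pairs a fixed euro ceiling with a percentage-of-revenue
    ceiling. For most companies the applicable ceiling is the HIGHER
    of the two; for SMEs, including startups, it is the LOWER.
    """
    pct_amount_eur = revenue_eur * pct_cap
    if is_sme:
        return min(fixed_cap_eur, pct_amount_eur)
    return max(fixed_cap_eur, pct_amount_eur)

# Prohibited-practice tier: 35 million euros or 7% of global annual revenue.
# Hypothetical startup with 10M euros in revenue: min(35M, 0.7M) = 700,000 euros.
print(applicable_cap(10_000_000, 35_000_000, 0.07, is_sme=True))

# Hypothetical large company with 2B euros in revenue: max(35M, 140M) = 140M euros.
print(applicable_cap(2_000_000_000, 35_000_000, 0.07, is_sme=False))
```

As the example shows, the same nominal tier can produce very different ceilings: the SME rule keeps the cap proportionate to a startup’s revenue, while larger companies face whichever ceiling is greater.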

Want to know more? Reach out to one of the authors or other members of Orrick’s AI team.

The EU AI Act Series: Key Takeaways for Companies Using and Developing AI
AI Resource Center