Announcing Mozilla Builders

2024 Accelerator Theme: Local AI

Local AI is the theme for our inaugural Accelerator because it aligns with our core values of privacy, user empowerment, and open source innovation. Local AI refers to running AI models and applications directly on personal devices like laptops and smartphones, rather than relying on cloud-based services controlled by Big Tech companies. This approach fosters greater data privacy and control by putting the technology directly into the hands of users. It also democratizes AI development by reducing costs, making powerful AI tools accessible to individual developers and small communities.

By focusing on Local AI, we aim to cultivate a decentralized, inclusive AI ecosystem where privacy and individual personalization are paramount.

Opportunities for Innovation

Here are some of the key areas where we see opportunities for innovation in Local AI. These areas are not exhaustive, and you should feel free to propose projects that fall outside of these categories.

Developer Productivity

Traditionally, software development was “local first” in nature, meaning that most developers started by writing and testing their software on their local machine, only shifting to a cloud setting later on when it became necessary (for example, to enable others to test or contribute to the software, or to deploy it to end users). AI has recently inverted this workflow: many developers now begin their work in the cloud, building their applications’ AI functionality by calling cloud APIs provided by Big Tech. That’s because those APIs and associated tools are generally quite easy to use and integrate.

In order for Local AI to succeed, developers must have tools that make it just as easy to build their apps entirely locally. Technologies that make it easier to build against locally-running models and integrate locally-running software components will help return us to “local first” development workflows, and in doing so help make Local AI a reality.
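One reason cloud APIs feel easy is that the request shape is simple and well known. Local model servers such as Ollama or llama.cpp can expose the same OpenAI-compatible chat endpoint, so pointing an app at a local model can be as small a change as swapping the URL. Below is a minimal sketch in Python using only the standard library; the endpoint URL and model name are assumptions based on Ollama's defaults and should be adjusted to your setup.

```python
import json
import urllib.request

# Assumption: an OpenAI-compatible server is running locally, e.g. Ollama's
# default endpoint. Adjust the URL and model name for your own setup.
LOCAL_API = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model, messages, url=LOCAL_API):
    """Build the same JSON chat request you would send to a cloud API,
    pointed at a locally-running model instead."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

def local_chat(model, messages):
    """Send the request to the local server and return the reply text."""
    req = build_chat_request(model, messages)
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the request shape matches the cloud APIs developers already know, tooling built this way lets the same application code run against local or remote models with a one-line configuration change.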

Agents

Agents represent the future of AI-powered applications, and perhaps even the future of software. They offer the promise of computers that are empowered to take complex actions on behalf of users, altering our expectations of what computers can do for us and how much control non-programmers can exert over them. For Local AI systems to play a role in this future, they will need their own robust agent capabilities. This means agents that can run on local hardware, that can operate in the background for long periods of time, and that can customize their behavior to match the user’s needs while also respecting their privacy and protecting their data security.
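The core of an agent is a plan/act loop: a model proposes the next action, the application executes it with a trusted tool, and the result feeds back into the next decision. The sketch below illustrates that loop running entirely on-device; the planner is a hard-coded stub standing in for a locally-running model, and the tool names are invented for illustration.

```python
# Minimal sketch of a local agent loop. `plan_next_step` stands in for a
# locally-running model that proposes either a tool call or completion;
# here it is a hard-coded stub, and the tools are illustrative.

TOOLS = {
    "read_clock": lambda: "12:00",
    "list_files": lambda: ["notes.txt"],
}

def plan_next_step(goal, history):
    """Stub planner: call each tool the goal mentions once, then finish."""
    for name in TOOLS:
        if name in goal and not any(step[0] == name for step in history):
            return {"action": "call", "tool": name}
    return {"action": "finish"}

def run_agent(goal, max_steps=10):
    """Run the plan/act loop entirely on-device: no data leaves the machine."""
    history = []
    for _ in range(max_steps):
        step = plan_next_step(goal, history)
        if step["action"] == "finish":
            break
        result = TOOLS[step["tool"]]()
        history.append((step["tool"], result))
    return history
```

Because both the planner and the tools run locally, the full action history stays on the user's device, which is what makes the privacy and data-security properties described above possible.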

Generative UI

As AI becomes woven into more and more of the modern computing experience, there is a growing need for user interfaces that can be just as adaptive as the AIs themselves. If AI can enable a computer to customize its behavior to a specific user’s needs, then it must be able to customize its input and output, too. If this is only possible at the operating system level, then open source AI applications will be at a disadvantage. If Local AI systems can dynamically generate user interfaces then that will enable a rich and self-feeding ecosystem of design, experimentation, and learning.
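One safe pattern for generative UI is to have the model emit a declarative spec rather than executable code, and let the application render that spec with trusted local widgets. The sketch below renders an invented JSON widget schema to a text mock-up; the widget types and field names are assumptions for illustration, not any established format.

```python
# Sketch of one way generative UI could work locally: the model emits a
# declarative spec, and the app renders it with trusted widgets (the model
# never produces executable code). The spec schema here is invented.

def render(spec, indent=0):
    """Render a UI spec tree to an indented text mock-up."""
    pad = "  " * indent
    kind = spec["type"]
    if kind == "label":
        return f"{pad}[label] {spec['text']}"
    if kind == "button":
        return f"{pad}[button] {spec['text']}"
    if kind == "column":
        rows = [render(child, indent + 1) for child in spec["children"]]
        return "\n".join([f"{pad}[column]"] + rows)
    raise ValueError(f"unknown widget type: {kind}")
```

Rejecting unknown widget types keeps the model's output inside a vocabulary the application controls, so an adaptive interface never becomes an arbitrary-code execution path.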

Local Fine-Tuning

Fine-tuning is a process by which large language models can be “re-trained” with additional data in ways that change or customize their knowledge and behavior. Today, fine-tuning is a key piece of the open source AI landscape, but can be time- and resource-intensive. This restricts who can do it, and the uses it can be applied to. Adapting fine-tuning techniques to be viable on local devices enables entirely new uses, such as models that continually learn and adapt themselves to their user.
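A key reason fine-tuning can become viable on local devices is parameter-efficient methods such as low-rank adapters (the idea behind LoRA): instead of updating a full d × d weight matrix, you train two skinny matrices A (d × r) and B (r × d) with r much smaller than d, and apply W + A·B at inference time. The toy sketch below shows the arithmetic and the parameter-count savings; it is an illustration of the idea, not a training implementation.

```python
# Toy illustration of low-rank adapters: train A (d x r) and B (r x d)
# with r << d, then use W + A @ B instead of retraining all of W.

def matmul(X, Y):
    """Plain-Python matrix multiply (illustrative; real code uses a tensor library)."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def adapted_weights(W, A, B):
    """Effective weights after applying a rank-r adapter: W + A @ B."""
    delta = matmul(A, B)
    return [[W[i][j] + delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

def adapter_params(d, r):
    """Trainable parameters: 2*d*r for the adapter vs d*d for full tuning."""
    return 2 * d * r
```

For a 4096-wide layer and rank 8, the adapter trains roughly 65 thousand parameters instead of nearly 17 million, which is the kind of reduction that puts continual, on-device adaptation within reach.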

Retrieval-Augmented Generation (RAG)

RAG applications enhance AI's capability to provide contextually accurate responses by integrating real-time data retrieval with generative processing. This capability is particularly interesting for Local AI because it makes smaller, less resource-hungry models more useful by enabling them to answer questions about data they weren’t trained on. Anything that improves the efficiency, performance, or quality of RAG techniques can have a major impact on the usefulness of Local AI applications.
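The RAG pattern itself is simple: score local documents against the query, retrieve the best match, and splice it into the prompt before generation. The sketch below uses crude word-overlap scoring where a real system would use an embedding model; the documents and prompt template are invented for illustration.

```python
# Minimal RAG sketch: retrieve the most relevant local document by word
# overlap (a real system would use embeddings), then splice it into the
# prompt so a small local model can answer from data it wasn't trained on.

DOCS = [
    "The office wifi password is hunter2.",
    "Team standup happens every weekday at 9am.",
]

def score(query, doc):
    """Crude relevance score: number of shared lowercase words."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query, docs=DOCS):
    """Return the highest-scoring document for the query."""
    return max(docs, key=lambda d: score(query, d))

def build_prompt(query):
    """Augment the question with retrieved context before generation."""
    context = retrieve(query)
    return f"Context: {context}\n\nQuestion: {query}\nAnswer:"
```

Because the relevant facts arrive in the prompt rather than in the model's weights, even a small on-device model can answer questions about private, local data.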

Function Calling

The next generation of AI hinges on the ability to dynamically execute complex functions. This empowers AI applications to perform sophisticated tasks on behalf of the user, from automatically operating software components, to retrieving data from the Internet, and much more. Today’s leading commercial AI offerings have strong function calling capabilities that most open models lack. In order for Local AI applications to be functionally competitive with cloud-based offerings, these capabilities (and open source components that make effective use of them) need to be further developed.
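On the application side, function calling means parsing a structured call the model emits and executing it against a registry of trusted functions. The sketch below loosely follows the common JSON shape used by commercial APIs ({"name": ..., "arguments": {...}}); the registry entries are invented for illustration, and validating against the registry is what keeps the model from invoking arbitrary code.

```python
import json

# Sketch of the application side of function calling: the model emits a
# JSON call, and the app validates it against a registry before executing.
# Function names here are illustrative.

REGISTRY = {
    "get_weather": lambda city: f"Sunny in {city}",
    "add": lambda a, b: a + b,
}

def dispatch(call_json):
    """Parse a model-emitted function call and run the matching tool."""
    call = json.loads(call_json)
    name, args = call["name"], call.get("arguments", {})
    if name not in REGISTRY:
        raise ValueError(f"model requested unknown function: {name}")
    return REGISTRY[name](**args)
```

The hard part for open models is not this dispatch code but reliably emitting well-formed calls in the first place, which is exactly the capability gap the paragraph above describes.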

Evaluation

Evaluation of AI-driven systems is an emerging field and discipline. It is key to understanding if they are behaving as designed, if they are sufficiently useful, and if they are safe to put in the hands of users. Open source developers need customizable evaluation tools to support the development of local AI applications.
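At its simplest, an eval harness runs a system under test over labelled cases and reports how often the output matches. The sketch below shows that skeleton with exact-match scoring; `system` is any callable, for example a wrapper around a locally-running model, and real harnesses add richer metrics than exact match.

```python
# Tiny sketch of an eval harness for a local AI system: run a callable
# over labelled cases and report exact-match accuracy plus failures.

def evaluate(system, cases):
    """cases: list of (input, expected). Returns (accuracy, failures)."""
    failures = []
    for prompt, expected in cases:
        got = system(prompt)
        if got != expected:
            failures.append((prompt, expected, got))
    accuracy = 1 - len(failures) / len(cases)
    return accuracy, failures
```

Keeping the failure list alongside the score matters: the failures are what a developer actually inspects to decide whether a local application is behaving as designed.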

Join Us

The Mozilla Builders Accelerator is an opportunity to explore the potential of Local AI to empower users, protect privacy, and foster a more inclusive AI ecosystem. We invite developers, researchers, and technologists to join us in this journey to create a future where AI is decentralized, accessible, and user-centric.

Apply or learn more at Mozilla Builders