Thunderbolt: Mozilla Unveils an Open Source AI Client for Enterprises
Looking to carve out a place in the AI market, Mozilla has launched Thunderbolt. It is not a new AI model, nor even an agentic web browser, but a front-end AI client designed for organizations that want to self-host their own AI infrastructure without relying on third-party cloud services.
Thunderbolt: Mozilla's sovereign AI client
Despite the name, Thunderbolt has nothing to do with Intel's Thunderbolt connectivity technology (widely used on Macs), and it is not part of Thunderbird, Mozilla's email client. It is a separate project developed by MZLA Technologies, the Mozilla subsidiary created in 2020 that is also responsible for Thunderbird development.
To build Thunderbolt, Mozilla relied on Haystack, an existing open source AI framework; the layer this new tool adds on top is what makes it a sovereign AI client. For users, the advantage is being able to interface with other services and agents thanks to support for ACP and MCP (Model Context Protocol). As a result, Thunderbolt can connect to enterprise systems and data, notably for the LLM inference stage.
Mozilla mentions the possibility of routing requests to the LLM of your choice, whether Claude, GPT, Mistral, or the OpenRouter service. Of course, connecting to a local AI is also possible.
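To illustrate the "bring your own LLM" idea described above: the providers the article lists all expose OpenAI-style chat endpoints, so a client can route a request simply by swapping the base URL. This is a minimal sketch of that routing principle, not Thunderbolt's actual code; the function name and the provider table are assumptions for illustration.

```python
# Sketch of base-URL routing across OpenAI-compatible providers.
# The "local" entry assumes a default Ollama install, whose
# OpenAI-compatible API listens on http://localhost:11434/v1.
PROVIDER_BASE_URLS = {
    "openrouter": "https://openrouter.ai/api/v1",
    "mistral": "https://api.mistral.ai/v1",
    "local": "http://localhost:11434/v1",
}

def chat_endpoint(provider: str) -> str:
    """Return the chat-completions URL for the chosen provider."""
    try:
        return PROVIDER_BASE_URLS[provider] + "/chat/completions"
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None

print(chat_endpoint("local"))  # http://localhost:11434/v1/chat/completions
```

Because the request and response formats are identical across these endpoints, switching from a cloud provider to a local model is, in principle, a configuration change rather than a code change.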
The application is designed to be self-hosted on your own server and works without network connectivity (apart from any external calls to an LLM). It relies on a locally hosted SQLite database that also serves as the source of truth. "The entire server stack (backend, PostgreSQL, PowerSync, Keycloak) runs via Docker Compose," the project's GitHub page explains.
GitHub? Yes, Thunderbolt is an open source project distributed under the MPL-2.0 license. All the code is therefore available in this GitHub repository maintained by the Thunderbird account.

What features does it offer?
Thunderbolt is designed to integrate with locally stored enterprise data via open protocols and can be paired with a model running locally. In terms of use cases, the client covers the now-standard AI tools: chat, search, and automation.
"We recommend using Thunderbolt with Ollama or llama.cpp if you want to benefit from free local inference; you can also add API keys for any OpenAI-compatible model provider in the settings," the documentation says.
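As a concrete illustration of that recommendation, here is a hedged sketch that assembles an OpenAI-style chat request for a local Ollama server (which exposes an OpenAI-compatible API at `http://localhost:11434/v1`). The helper name and the model name `llama3.2` are assumptions for illustration; use whichever model you have pulled locally.

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload (built, not sent)."""
    return {
        "model": model,  # assumption: any model already pulled into Ollama
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("llama3.2", "Summarize this document.")
print(json.dumps(payload))

# To actually send it (requires a running Ollama instance):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/v1/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read())
```

The same payload works unchanged against any OpenAI-compatible provider once you add the corresponding API key, which is exactly what makes the "add API keys in the settings" option possible.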
That said, the project is still in active development, with a security audit under way. Other features are planned, including end-to-end encryption (preview), agent memory, and tasks (preview). The roadmap is available on this page.
Native apps are available for direct download on Windows, macOS, Linux, iOS, and Android, and there is also web access.

Mozilla wants to make Thunderbolt a product ready for enterprise use. Its work in the AI field is particularly interesting because it clearly follows an open source, self-hostable approach in which the user remains in control.


