Thunderbird add-on designed to enhance both professional and personal email management.
This add-on integrates a range of AI (LLM) features to streamline your inbox experience.
Our aim is to assist users dealing with high volumes of daily emails, providing tools for tasks like summarizing messages, translating content, offering structured support for composing responses and much more.
Have you ever had an inbox full of hundreds of unread emails that you need to respond to?
We have, and more than once.
That's why we decided to create this add-on for Thunderbird to help manage the multitude of emails we read daily as part of our work activities.
Several LLMs (Large Language Models) are integrated to provide a range of options for advanced text management, operating at the deepest possible semantic level, to optimize the management of your email inbox.
The LLMs* currently supported are:
- Claude by Anthropic;
- DeepSeek by Hangzhou DeepSeek Artificial Intelligence Basic Technology Research;
- Gemini by Google;
- GPT by OpenAI;
- Grok by xAI;
- Mistral by Mistral AI.
It is possible to access a wider set of models (e.g., Llama, Phi, Mistral, Gemma, and many others) through the use of:
- Groq Cloud;
- LM Studio;
- Ollama.
* To use them, it is necessary to create an account on the respective platforms and enable an API access key. Usage fees apply; for more details, please refer to the respective websites.
ATTENTION 1: The services offered by Groq Cloud and Mistral AI include the option to use a free plan, albeit with low rate limits on requests.
ATTENTION 2: Unlike the other LLM services, LM Studio and Ollama allow you to run open-source models directly on your own PC, with no additional costs and maximum privacy, since everything is executed locally.
The downside is that this requires SIGNIFICANT hardware resources.
After installing the add-on, it will be possible to configure the desired LLM service provider from the related settings.
You can access the add-on settings by going to Tools → Add-ons and Themes, and then selecting the wrench icon next to AI Mail Support.
Depending on the chosen LLM, additional provider-specific options become available; for example, below is a screenshot of all possible configurations when OpenAI is selected as the provider.
Typically, an authentication key needs to be configured; the specific method depends on the LLM provider.
In the options, there will be a quick link to the official website with useful details.
Once the add-on is configured, it will be possible to interact with the AI management features within Thunderbird in three different locations:
- In the email view, "AI support" menu:
- In the email composition or editing window, by selecting "AI support" in the top right:
- By selecting any text in either the email viewing or composition window, in the "AI Mail Support" section:
Regardless of how a request for processing is made, the output (audio or text) will be displayed in a dedicated pop-up at the bottom of the mail client.
If you use the Owl for Exchange add-on to manage Exchange or Office365 accounts, note that the add-on will detect them and display a malfunction warning.
Run the following to build the add-on directly from the source code:
$ git clone https://github.com/YellowSakura/aimailsupport.git
$ cd aimailsupport
$ npm install
To compile a development version of the add-on and install it in Thunderbird via Tools → Developer Tools → Debug Add-ons → Load Temporary Add-on…, use the following command:
$ npm run build
To generate a file named ai-mail-support.xpi in the project's root folder, as a package ready for installation as an add-on in Thunderbird, use the following command:
$ npm run build:package
To assess the overall quality of the code, you can use the following command:
$ npm run quality
It is possible to run unit tests using the command:
$ npm run test
However, it is necessary to prepare an .env file beforehand, with the keys for the various LLM services in the following format:
anthropic_api_key = KEY_VALUE
deepseek_api_key = KEY_VALUE
google_api_key = KEY_VALUE
groq_api_key = KEY_VALUE
mistral_api_key = KEY_VALUE
openai_api_key = KEY_VALUE
xai_api_key = KEY_VALUE
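As a quick sketch, the .env file can be created in the project root like this, using the key names listed above (replace each PLACEHOLDER with a real API key for the services you want to test, and keep the file out of version control, as it contains secrets):

```shell
# Create a minimal .env file in the project root.
# PLACEHOLDER values are stand-ins; substitute your real API keys.
cat > .env << 'EOF'
anthropic_api_key = PLACEHOLDER
deepseek_api_key = PLACEHOLDER
google_api_key = PLACEHOLDER
groq_api_key = PLACEHOLDER
mistral_api_key = PLACEHOLDER
openai_api_key = PLACEHOLDER
xai_api_key = PLACEHOLDER
EOF
```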
To test LM Studio, it is necessary to install the model llama-3.2-1b from the GUI or using the command:
$ lms get llama-3.2-1b
To test Ollama, it is necessary to install the model llama3.2:1b using the command:
$ ollama pull llama3.2:1b
AI Mail Support for Thunderbird aims to make use of a minimal set of permissions for its operation, specifically:
- accountsRead: See your mail accounts, their identities and their folders.
  Used to identify the presence of any accounts managed by the Owl for Exchange add-on and display a malfunction warning as indicated in the Owl for Exchange bug section, see https://webextension-api.thunderbird.net/en/latest/accounts.html#permissions.
- compose: Read and modify your email messages as you compose and send them.
  Used to interact with the email composition window (replying or creating a new email), see https://webextension-api.thunderbird.net/en/latest/compose.html#permissions.
- menus: Required to use messenger.menus.* functions.
  Used to create custom menus, see https://webextension-api.thunderbird.net/en/latest/menus.html#permissions.
- messagesRead: Read your email messages.
  Used to read the content of an existing email in the viewing window, see https://webextension-api.thunderbird.net/en/latest/messages.html#permissions.
- messagesModify: Read and modify your email messages as they are displayed to you.
  Used to modify the content of an existing email in the viewing window, see https://webextension-api.thunderbird.net/en/latest/messageDisplayScripts.html#permissions.
- sensitiveDataUpload: The contents of the emails are sent (based on the choices made in the options) to the different LLM service providers, who will then be able to process them.
- storage: Enables the add-on to store and retrieve data, and listen for changes to stored items.
  Used to store user settings, see https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/storage.
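Taken together, the permissions above correspond to a manifest.json fragment along these lines (an illustrative sketch built from the list above, not a copy of the add-on's actual manifest):

```json
{
  "permissions": [
    "accountsRead",
    "compose",
    "menus",
    "messagesRead",
    "messagesModify",
    "sensitiveDataUpload",
    "storage"
  ]
}
```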
The add-on uses a very small set of messages that require localization, so extending the translation is really simple. Here's what you need to do:
- Copy the file src/locales/en-messages.json to src/locales/%ISO CODE%-messages.json, where %ISO CODE% is your ISO-639-1 language code;
- Translate your src/locales/%ISO CODE%-messages.json, specifically the message fields, and remove the description fields used to provide additional context;
- Add a new line in the package.json file, in line with the other build:locales-* entries, in alphabetical order compared to the current ones, in the form: "build:locales-%ISO CODE%": "node_modules/.bin/json-minify src/locales/%ISO CODE%-messages.json > ai-mail-support/_locales/%ISO CODE%/messages.json",
- Add a new configuration again in the package.json, in the build:locales section, maintaining alphabetical order once again;
- Add the folder %ISO CODE% to _locales;
- Test using the build process as described in the Getting started section and release the modification as a pull request.
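As a concrete sketch of the two package.json changes, assuming Italian (it) were the new locale and the build:locales script simply chains the per-locale scripts (an assumption; match whatever form the existing entries actually use):

```json
{
  "scripts": {
    "build:locales-en": "node_modules/.bin/json-minify src/locales/en-messages.json > ai-mail-support/_locales/en/messages.json",
    "build:locales-it": "node_modules/.bin/json-minify src/locales/it-messages.json > ai-mail-support/_locales/it/messages.json",
    "build:locales": "npm run build:locales-en && npm run build:locales-it"
  }
}
```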
The code is licensed under the MIT License by Yellow Sakura, [email protected]; see the LICENSE file.
For more details, please refer to the project page and the link to the official AMO (addons.mozilla.org) page.
Dependencies:
- ESLint is licensed under MIT License;
- parcel is licensed under MIT License;
- posthtml is licensed under MIT License;
- sanitize-html is licensed under MIT License;
- types/sanitize-html is licensed under MIT License;
- types/thunderbird-webext-browser is licensed under MIT License;
- typescript-eslint/parser is licensed under BSD 2-clause license.
Images:
- Robot logo icons in docs/bot-icon-* were created by Smashicons - Freepik.
All trademarks mentioned are the property of their respective owners.