Integrating LLMs and AI into AutoMagic
AutoMagic currently supports Ollama, Google's Gemini and OpenAI large language models.
The AI configuration settings
Select Settings, Configuration, AI and LLMs from the main menu.
This will open the AI settings. AutoMagic supports multiple AI models; they can all be configured here, and the user can choose which model best fits the task when the AI is invoked.
Users will require permission to access the AI. There are two types of permission:
- LLM: allows access to all of the AI models in the configuration.
- LLM_FREE: only allows access to models that are marked as free to use. A tick box on each AI configuration denotes whether an AI service is free or paid for.
Setting up Google's Gemini
This is configured from the AI configuration settings. You will need an API key from Google AI Studio.
- Set the AI model to Gemini.
- Set the API key.
- Set the name of the model you wish to use; the available models are listed in Google AI Studio. For example gemini-2.0-flash.
- Set the temperature to 0 (zero).
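Before entering these settings in AutoMagic, it can be useful to confirm that the API key and model name work. The snippet below is only a minimal sketch, assuming the google-genai Python SDK; the key, prompt and printout are placeholders and are not part of AutoMagic.

```python
# Minimal sketch (not AutoMagic code): check a Gemini API key and model name.
# Assumes the google-genai Python SDK: pip install google-genai
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_GEMINI_API_KEY")  # the key entered in the settings

response = client.models.generate_content(
    model="gemini-2.0-flash",                           # the model name entered in the settings
    contents="Reply with the single word OK.",
    config=types.GenerateContentConfig(temperature=0),  # temperature 0, as recommended above
)
print(response.text)
```

If this prints a reply, the same key and model name should work once entered in the AutoMagic configuration.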
OpenAI and ChatGPT
You will need an API key from OpenAI.
- Set the AI model to OpenAI.
- Set the API key.
- Set the name of the model you wish to use; the available models are listed in OpenAI's documentation. For example o3-mini.
- Set the temperature to 0 (zero).
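As with Gemini, you can sanity-check the key and model name outside AutoMagic first. This is a minimal sketch using the official openai Python SDK; the key and prompt are placeholders, not AutoMagic internals.

```python
# Minimal sketch (not AutoMagic code): check an OpenAI API key and model name.
# Assumes the official openai Python SDK: pip install openai
from openai import OpenAI

client = OpenAI(api_key="YOUR_OPENAI_API_KEY")  # the key entered in the settings

response = client.chat.completions.create(
    model="o3-mini",  # the model name entered in the settings
    messages=[{"role": "user", "content": "Reply with the single word OK."}],
    # Some reasoning models (o3-mini among them) reject an explicit temperature,
    # so it is omitted here; for models that accept it, temperature=0 matches
    # the setting recommended above.
)
print(response.choices[0].message.content)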
Ollama
First you will need a working setup of Ollama and to download one of the available models. You will need a model that supports tool calling; the Ollama model library shows which models support tools. Instructions for installing Ollama are available on the Ollama website.
- Set the AI model to Ollama.
- Set the Model name to the name of the model you are using. You can use ollama list to get a list of the models you have installed.
- Set the URL. If you are running Ollama on your local computer the URL will look something like http://localhost:11434/
- Set the temperature to 0 (zero).
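To confirm the URL and model name are correct, you can call the Ollama server directly. The sketch below uses Python's requests library against Ollama's /api/chat endpoint; the model name shown is only an example, use one from ollama list.

```python
# Minimal sketch (not AutoMagic code): check the Ollama URL and model name.
# Assumes Python with the requests library: pip install requests
import requests

OLLAMA_URL = "http://localhost:11434"  # the URL entered in the settings
MODEL = "llama3.1"                     # example only; use a tool-capable model from `ollama list`

response = requests.post(
    f"{OLLAMA_URL}/api/chat",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Reply with the single word OK."}],
        "options": {"temperature": 0},  # temperature 0, as recommended above
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["message"]["content"])
```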
What does the integration do?
The integration between AutoMagic and the supported language models allows the AI to extract data from the AutoMagic database. AutoMagic has various prompt endpoints that can interact with the AI. Currently the most common use cases are:
- Giving the AI some data.
- Asking it to extract or reason about the data input.
- Asking it to respond in a structured way so that the software can use the response.
For example, we could feed the AI a PDF or text order from a customer and ask it to extract the products and quantities the customer requires. The response can then be imported into AutoMagic as a New Sale or Stock Order.
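AutoMagic's prompt endpoints handle this for you, but the pattern is easy to illustrate. The sketch below, using the OpenAI Python SDK, asks a model to return order lines as JSON; the prompt wording, JSON shape and model name are illustrative assumptions, not AutoMagic's actual endpoint.

```python
# Illustrative sketch of the "extract products and quantities" pattern.
# The prompt, JSON shape and model name are assumptions, not AutoMagic internals.
import json
from openai import OpenAI

client = OpenAI(api_key="YOUR_OPENAI_API_KEY")

order_text = "Please send 12 boxes of A4 paper and 3 black toner cartridges."

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "system",
            "content": (
                "Extract the ordered items from the customer's message. "
                'Respond only with JSON of the form {"lines": [{"product": str, "quantity": int}]}.'
            ),
        },
        {"role": "user", "content": order_text},
    ],
    response_format={"type": "json_object"},  # ask for machine-readable output
    temperature=0,
)

for line in json.loads(response.choices[0].message.content)["lines"]:
    print(line["quantity"], "x", line["product"])
```

A structured response like this is what allows the result to be imported back into AutoMagic as a New Sale or Stock Order.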
What data does AutoMagic share with the AI?
Data is only shared with the AI while you are accessing the prompt. If you are not using the AI prompts directly, no data is shared.
Once data is handed over, it is up to the AI provider what it does with your data and how long that data is retained.
AutoMagic has various tooling that the AI can use to extract data from the AutoMagic database. The tools include, but are not limited to:
- Searching the stock list, item descriptions and other data about the stock, including prices.
- Searching the customer address book. There are operations where we need the AI to resolve a Contact ID from the address book; this requires sharing information about the customers and suppliers with the AI.
- Searching previous sales and purchase orders. These tools allow the AI to understand a customer's or supplier's ordering history.
To allow the AI to reason about more complex tasks we share any necessary data about the task. This helps the AI to formulate a better response.
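The exact tool definitions are internal to AutoMagic, but the general mechanism is standard function calling. The sketch below, using the OpenAI Python SDK, offers the model a hypothetical search_stock tool; the tool name, parameters and prompt are assumptions for illustration only.

```python
# Illustrative sketch of the tool-calling mechanism described above.
# The search_stock tool, its parameters and the prompt are hypothetical;
# they are not AutoMagic's real tool definitions.
import json
from openai import OpenAI

client = OpenAI(api_key="YOUR_OPENAI_API_KEY")

tools = [{
    "type": "function",
    "function": {
        "name": "search_stock",
        "description": "Search the stock list by description and return matching items with prices.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string", "description": "Search text"}},
            "required": ["query"],
        },
    },
}]

messages = [{"role": "user", "content": "What do we charge for A4 paper?"}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
    tools=tools,
)

# The model does not run the tool itself; it asks the host application to run it
# and return the result, so data only leaves the database when a tool is invoked.
for call in response.choices[0].message.tool_calls or []:
    print("Model requested tool:", call.function.name, json.loads(call.function.arguments))
```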