Integrating LLMs and AI into AutoMagic

AutoMagic currently supports Ollama, Google's Gemini and OpenAI large language models.

The AI configuration settings

Select Settings, Configuration, AI and LLMs from the main menu. This will open the AI settings. AutoMagic supports multiple different AI models. They can all be configured in the settings and the user can choose which AI model best fits the task when the AI is invoked.

Users will require permission to access the AI. There are two types of permission:

  1. LLM: Allows access to all of the AI models in the configuration.
  2. LLM_FREE: Only allows access to models that are marked as free to use. A tick box on each AI configuration denotes whether an AI service is free or paid.

Setting up Google's Gemini

From the AI configuration settings:

You will need an API key from Google AI Studio.

  1. Set the AI model to Gemini
  2. Set the API key.
  3. Set the name of the model you wish to use. The available models are listed in the AI Studio, for example gemini-2.0-flash.
  4. Set the temperature to 0 (zero).
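The steps above map onto a single call to Google's public generateContent REST endpoint. As a minimal sketch, the following builds (but does not send) such a request; the API key and prompt are placeholders:

```python
import json

API_KEY = "YOUR_API_KEY"    # placeholder: your key from Google AI Studio
MODEL = "gemini-2.0-flash"  # any model name listed in the AI Studio

# The Gemini REST endpoint for a single, non-streaming generation call.
url = (
    "https://generativelanguage.googleapis.com/v1beta/"
    f"models/{MODEL}:generateContent?key={API_KEY}"
)

# Request body: the prompt plus a generationConfig pinning temperature to 0,
# matching step 4 (deterministic output is preferred for data extraction).
body = {
    "contents": [{"parts": [{"text": "Reply with the word OK."}]}],
    "generationConfig": {"temperature": 0},
}

payload = json.dumps(body)
```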

OpenAI and ChatGPT

You will need an OpenAI API key, which you can create in your OpenAI account.

  1. Set the AI model to OpenAI
  2. Set the API key.
  3. Set the name of the model you wish to use. The available models are listed in OpenAI's documentation, for example o3-mini.
  4. Set the temperature to 0 (zero).
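For reference, these settings correspond to a request against OpenAI's Chat Completions endpoint. A rough sketch that builds (but does not send) the request; the key is a placeholder, and note that some reasoning models, o3-mini among them, only accept the default temperature:

```python
import json

API_KEY = "YOUR_API_KEY"  # placeholder: create one in your OpenAI account

# Standard bearer-token authentication header for the OpenAI API.
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# Chat Completions request body. Temperature 0 matches step 4, though some
# reasoning models (including o3-mini) only support the default temperature.
body = {
    "model": "o3-mini",
    "messages": [{"role": "user", "content": "Reply with the word OK."}],
    "temperature": 0,
}

url = "https://api.openai.com/v1/chat/completions"
payload = json.dumps(body)
```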

Ollama

First you will need a working Ollama installation with at least one model downloaded. The model must support tool calling; a list of supported models is available here.

Instructions for installing Ollama are available here.

  1. Set the AI model to Ollama
  2. Set the Model name to the name of the model you are using. Running ollama list in a terminal shows the models you have installed.
  3. Set the URL. If you are running Ollama on your local computer the URL will look something like http://localhost:11434/
  4. Set the temperature to 0 (zero).
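These settings correspond to a request against Ollama's /api/chat endpoint. A minimal sketch that builds (but does not send) such a request; the model name is a placeholder for whichever tool-capable model you have pulled:

```python
import json

# Request body for Ollama's /api/chat endpoint. Unlike the cloud APIs,
# temperature is passed inside "options", and "stream": False asks for a
# single JSON response instead of a stream of chunks.
body = {
    "model": "llama3.1",  # placeholder: pick a model from `ollama list`
    "messages": [{"role": "user", "content": "Reply with the word OK."}],
    "options": {"temperature": 0},
    "stream": False,
}

# The base URL from step 3, plus the chat endpoint path.
url = "http://localhost:11434/api/chat"
payload = json.dumps(body)
```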

What does the integration do?

The integration between AutoMagic and the supported language models allows the AI to extract data from the AutoMagic Database.

AutoMagic has various prompt endpoints that can interact with the AI. Currently the most common use cases are:

  1. Giving the AI some data.
  2. Asking it to extract or reason about that data.
  3. Having it respond in a structured way so the software can use the response.

For example, we could feed the AI a PDF or text order from a customer and ask it to extract the products and quantities the customer requires. The response can then be imported into AutoMagic as a New Sale or Stock Order.
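A sketch of what that structured response and its handling might look like. The JSON field names here are illustrative only, not AutoMagic's actual import schema:

```python
import json

# Example of a structured reply the AI might be asked to produce when
# extracting products and quantities from a customer's order document.
# The schema (order_lines, product, quantity) is invented for this sketch.
ai_response = """
{
  "order_lines": [
    {"product": "Widget A", "quantity": 10},
    {"product": "Widget B", "quantity": 3}
  ]
}
"""

# Because the reply is structured JSON rather than free text, the software
# can parse it directly and build a New Sale or Stock Order from the lines.
order = json.loads(ai_response)
for line in order["order_lines"]:
    print(f'{line["quantity"]} x {line["product"]}')
```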

What data does AutoMagic share with the AI?

Data is only shared with the AI while you are accessing the prompt. If you are not using the AI prompts directly, no data is shared.

Once we hand over data, it is up to the AI provider what it does with your data and how long that data is retained.

AutoMagic has various tooling that the AI can use to extract data from the AutoMagic database. The tools include, but are not limited to:

  1. Searching the stock list, item descriptions and other data about the stock, including prices.
  2. Searching the customer address book. There are operations where we need the AI to resolve a Contact ID from the address book, this requires sharing information about the customers and suppliers with the AI.
  3. Searching previous sales and purchase orders. These tools allow the AI to understand a customer's or supplier's ordering history.
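As an illustration of how such a tool might be described to the model, here is a hypothetical stock-search tool in the JSON function-schema format used by OpenAI-style tool calling (Gemini and Ollama accept very similar declarations). The name and parameters are invented for this sketch, not AutoMagic's real tool definitions:

```python
# Hypothetical tool declaration. The model never runs this itself; it replies
# with the tool name and arguments, and the software performs the database
# search and returns the results to the model.
stock_search_tool = {
    "type": "function",
    "function": {
        "name": "search_stock",  # hypothetical name for this sketch
        "description": (
            "Search the stock list by description and return "
            "matching items with their prices."
        ),
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "Free-text search terms.",
                },
            },
            "required": ["query"],
        },
    },
}
```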

To allow the AI to reason about more complex tasks we share any necessary data about the task. This helps the AI to formulate a better response.