Ollama API provider

From MoodleDocs

The Moodle integration with Ollama provides access to AI actions such as generating, summarising, and explaining text.

Requirements

  1. Download and install Ollama.
  2. Pull an LLM from the Ollama library, such as llama3.2, and make it available.
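Step 2 above is usually done from the command line with `ollama pull llama3.2`, but a model can also be pulled through Ollama's REST API. A minimal Python sketch, assuming Ollama is running on its default endpoint (http://localhost:11434):

```python
import json
import urllib.request

OLLAMA_ENDPOINT = "http://localhost:11434"  # Ollama's default endpoint

def build_pull_request(model: str) -> urllib.request.Request:
    """Build a request asking the Ollama server to pull a model from
    the Ollama library (the API equivalent of `ollama pull <model>`)."""
    body = json.dumps({"model": model, "stream": False}).encode()
    return urllib.request.Request(
        f"{OLLAMA_ENDPOINT}/api/pull",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example (requires a running Ollama server; the pull can take minutes):
# with urllib.request.urlopen(build_pull_request("llama3.2")) as resp:
#     print(json.load(resp))
```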

Ollama provider instance set-up

To create a provider instance:

  1. Go to Site administration > General > AI providers.
  2. Click 'Create a new provider instance'.
  3. Select Ollama as the AI provider plugin, enter a name and the API endpoint, then click 'Create instance'.
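Before entering the API endpoint in step 3, it can be useful to confirm that the Ollama server is reachable and that your model is available. Ollama's `/api/tags` endpoint returns the pulled models; a small Python sketch (the network call is left commented out, and the default endpoint is an assumption):

```python
import json
import urllib.request

def model_names(tags_response: dict) -> list:
    """Extract model names from an Ollama /api/tags response,
    which has the shape {"models": [{"name": ...}, ...]}."""
    return [m["name"] for m in tags_response.get("models", [])]

# Example check against a local server (uncomment to run):
# with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
#     print(model_names(json.load(resp)))  # e.g. ['llama3.2:latest']
```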

To configure the Ollama provider instance:

  1. Go to Site administration > General > AI providers.
  2. Click the settings link for Ollama.
  3. Enable/disable actions as required.

For each action, you can customise the model's behaviour.

  1. Click the settings link for an action.
  2. Review and amend the system instruction as necessary.
  3. Enter values for the following settings as required and save changes.
  • Mirostat - A neural text decoding algorithm that controls perplexity during sampling. 0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0. (Default: 0)
  • Temperature - Temperature influences whether the output is more random and creative or more predictable. Increasing the temperature will make the model answer more creatively. (Default: 0.8)
  • Seed - Sets the random number seed to use for generation. Setting this to a specific number will make the model generate the same text for the same prompt. (Default: 0)
  • top_k - Reduces the probability of generating nonsense. A higher value (e.g. 100) will give more diverse answers, while a lower value (e.g. 10) will be more conservative. (Default: 40)
  • top_p - Works together with top_k. A higher value (e.g. 0.95) will lead to more diverse text, while a lower value (e.g. 0.5) will generate more focused and conservative text. (Default: 0.9)
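To see how these settings relate to one another, here is an illustrative Python sketch of the kind of request body an Ollama client sends to the server's /api/generate endpoint, where settings like these travel in an `options` object. This is not Moodle's actual implementation; the function name and defaults below simply mirror the list above:

```python
# Defaults as listed above for the per-action settings.
DEFAULTS = {
    "mirostat": 0,      # 0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0
    "temperature": 0.8, # higher = more creative, lower = more predictable
    "seed": 0,          # fixed non-zero seed gives repeatable output
    "top_k": 40,        # higher = more diverse, lower = more conservative
    "top_p": 0.9,       # works together with top_k
}

def generate_payload(model: str, prompt: str, **overrides) -> dict:
    """Build an illustrative /api/generate request body, with the
    sampling settings above merged into the `options` object."""
    options = {**DEFAULTS, **overrides}
    return {"model": model, "prompt": prompt, "stream": False,
            "options": options}

# e.g. a more deterministic, conservative configuration:
payload = generate_payload("llama3.2", "Summarise this text...",
                           temperature=0.2, seed=42, top_k=10)
```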

To enable the Ollama provider instance:

  1. Go to Site administration > General > AI providers.
  2. Click the toggle to enable the instance.

See also

  • Ollama in the Moodle 4.5 documentation, which covers using Ollama with the OpenAI provider.