Ollama API provider
From MoodleDocs
The Moodle integration with Ollama provides access to AI actions such as generating text, summarising text and explaining text.
Requirements
- Download and install Ollama.
- Pull an LLM from the Ollama library, such as llama3.2, and make it available.
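Models can also be pulled through Ollama's REST API rather than the CLI. The sketch below builds such a request in Python; it assumes Ollama's default endpoint (http://localhost:11434), and the `pull_request` helper is illustrative, not part of Moodle or Ollama.

```python
import json

# Assumes Ollama is listening on its default port, 11434.
OLLAMA_BASE = "http://localhost:11434"

def pull_request(model: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for Ollama's /api/pull endpoint.

    Hypothetical helper for illustration only.
    """
    url = f"{OLLAMA_BASE}/api/pull"
    body = json.dumps({"model": model}).encode("utf-8")
    return url, body

url, body = pull_request("llama3.2")
# POST `body` to `url` (e.g. with urllib.request) once Ollama is running.
```

Once the model has been pulled, it appears in the Ollama library on that host and can be selected when configuring the provider.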
Ollama provider instance set-up
To create a provider instance:
- Go to Site administration > General > AI providers.
- Click 'Create a new provider instance'.
- Select Ollama as the AI provider plugin, enter a name and the API endpoint, then click 'Create instance'.
To configure the Ollama provider instance:
- Go to Site administration > General > AI providers.
- Click the settings link for Ollama.
- Enable/disable actions as required.
For each action, you can customise the model's behaviour.
- Click the settings link for an action.
- Review and amend the system instruction as necessary.
- Enter values for the following settings as required and save changes.
- Mirostat - Mirostat is a neural text decoding algorithm for controlling perplexity. 0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0. (Default: 0)
- Temperature - Controls how random or predictable the output is. Increasing the temperature makes the model answer more creatively. (Default: 0.8)
- Seed - Sets the random number seed to use for generation. Setting this to a specific number will make the model generate the same text for the same prompt. (Default: 0)
- top_k - Reduces the probability of generating nonsense. A higher value (e.g. 100) will give more diverse answers, while a lower value (e.g. 10) will be more conservative. (Default: 40)
- top_p - Works together with top_k. A higher value (e.g. 0.95) will lead to more diverse text, while a lower value (e.g. 0.5) will generate more focused and conservative text. (Default: 0.9)
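The settings above correspond to the "options" object that Ollama accepts in its generation API. The Python sketch below shows such a payload using the defaults listed on this page; the `build_options` helper, the model name and the prompt are illustrative, not Moodle code.

```python
import json

def build_options(mirostat=0, temperature=0.8, seed=0, top_k=40, top_p=0.9):
    """Assemble Ollama generation options (defaults as listed above)."""
    return {
        "mirostat": mirostat,        # 0 = disabled, 1 = Mirostat, 2 = Mirostat 2.0
        "temperature": temperature,  # higher = more creative output
        "seed": seed,                # a fixed seed makes output reproducible
        "top_k": top_k,              # lower = more conservative sampling
        "top_p": top_p,              # works together with top_k
    }

# Example body for a POST to Ollama's /api/generate endpoint.
payload = {
    "model": "llama3.2",
    "prompt": "Summarise this text ...",
    "options": build_options(temperature=0.5, seed=42),
}
request_body = json.dumps(payload)
```

Lowering the temperature and fixing the seed, as in the example, makes an action's output more predictable and repeatable for the same prompt.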
To enable the Ollama provider instance:
- Go to Site administration > General > AI providers.
- Click the toggle to enable the instance.
See also
- See the Moodle 4.5 documentation on Ollama for using Ollama with the OpenAI provider.