Configure Ollama for AI Review Assistant
This page describes how to configure Ollama as your AI provider for AI Review Assistant.
As a repository administrator, follow these steps:
Go to the Code Review Assistant repository settings.
Make sure you are in the AI Review Assistant tab at the top.
In the API URL section, enter the REST API address of your Ollama server: `http://<ollama-server-host>:11434/api/chat`, replacing `<ollama-server-host>` accordingly. Leave the Headers section empty.
In the Request Body section, click the ➕ icon, enter `model` in the input, then click Add > Text and enter `phi4:latest`. Make sure to review the available models in the official Ollama documentation.

Still in the Request Body section, click the ➕ icon, enter `stream` in the input, then click Add > Boolean and leave it disabled.

You may leave the API Key section empty.
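Put together, the Request Body fields above correspond to a JSON payload along the following lines (a sketch; the `messages` array is filled in by the app at review time, so you only configure `model` and `stream`):

```json
{
  "model": "phi4:latest",
  "stream": false,
  "messages": [
    { "role": "user", "content": "<review prompt inserted by the app>" }
  ]
}
```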
In the Response Mapping JQL section, enter `message.content`.

In the Response Error Mapping JQL section, enter `error.message`.

Click the "Test" button and check that you get confirmation that everything is configured properly.
Click “Save” to save your configuration.
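The two mapping fields above pull values out of the JSON that Ollama returns. The following sketch shows how such dotted-path mappings resolve against a sample response; the success shape follows Ollama's non-streaming `/api/chat` format, while the error payload shape is a hypothetical illustration of what the `error.message` mapping expects:

```python
# Resolve a dotted path such as "message.content" against a parsed JSON object.
def resolve(path: str, obj):
    for key in path.split("."):
        if not isinstance(obj, dict) or key not in obj:
            return None
        obj = obj[key]
    return obj

# Shape of a successful non-streaming Ollama /api/chat response.
success = {
    "model": "phi4:latest",
    "message": {"role": "assistant", "content": "Looks good overall."},
    "done": True,
}

# Hypothetical error payload matching the configured "error.message" mapping.
failure = {"error": {"message": "model 'phi4:latest' not found"}}

print(resolve("message.content", success))  # -> Looks good overall.
print(resolve("error.message", failure))    # -> model 'phi4:latest' not found
```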
If you have trouble configuring AI Review Assistant, we are happy to help! 🎉