
Configure Ollama for AI Review Assistant

This page describes how to configure Ollama as your AI provider for AI Review Assistant.

As a repository administrator, follow these steps:

  1. Go to the Code Review Assistant repository settings

  2. Make sure you are on the AI Review Assistant tab at the top

  3. In the API URL section, enter the REST API address for Ollama: http://<ollama-server-host>:11434/api/chat, replacing <ollama-server-host> with the host name of your Ollama server.

  4. Leave the Headers section empty.

  5. In the Request Body section, click on the ➕ icon. Enter model in the input, then click on Add > Text and enter the model name, for example phi4:latest. Make sure to review the available models in the official Ollama documentation.

  6. Still in the Request Body section, click on the ➕ icon. Enter stream in the input, then click on Add > Boolean and leave it disabled. The resulting body configuration should contain only the model and stream entries (see the request sketch after these steps).

  7. You may leave the API Key section empty.

  8. In the Response Mapping JQL section, enter message.content (see the response sketch after these steps).

  9. In the Response Error Mapping JQL section, enter error.message.

  10. Click on the “Test” button and check that you get the confirmation that everything is configured properly.

  11. Click “Save” to save your configuration.
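
If you want to check the endpoint yourself before clicking “Test” in step 10, the configuration above corresponds to a request like the one in this minimal Python sketch. The host and the phi4:latest model are the placeholder values from the steps, and the messages field is only there to make the standalone call valid; it is not something you configure in the app.

    import requests

    # Placeholder values from the steps above; replace <ollama-server-host>
    # and the model name with your actual Ollama host and model.
    url = "http://<ollama-server-host>:11434/api/chat"
    body = {
        "model": "phi4:latest",  # step 5: Request Body > model (Text)
        "stream": False,         # step 6: Request Body > stream (Boolean, disabled)
        # AI Review Assistant supplies the review prompt itself; a messages
        # field is included here only so this standalone request is valid.
        "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
    }

    response = requests.post(url, json=body, timeout=120)
    response.raise_for_status()
    print(response.json())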
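
With stream disabled, Ollama’s /api/chat endpoint answers with a single JSON document in which the assistant text sits under message.content, which is what the Response Mapping JQL from step 8 extracts. The response sketch below shows how both mapping paths resolve; the error payload shape is an assumption chosen to match the error.message mapping from step 9, not something taken from the Ollama documentation.

    # Abridged shape of a successful non-streaming /api/chat reply.
    success = {
        "model": "phi4:latest",
        "message": {"role": "assistant", "content": "Hello! How can I help?"},
        "done": True,
    }
    # The Response Mapping JQL "message.content" walks this path:
    print(success["message"]["content"])

    # Assumed error payload matching the "error.message" mapping; adjust the
    # mapping in step 9 if your Ollama server reports errors in a different shape.
    failure = {"error": {"message": "model 'phi4:latest' not found"}}
    print(failure["error"]["message"])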

 

If you have trouble configuring AI Review Assistant, we are happy to help! 🎉