Connecting an LLM Provider

RepoFlow allows you to integrate an external Large Language Model (LLM) provider to enable AI-powered features.

Configuration Options

The environment variables required to configure an LLM provider are listed in the AI Environment Variables section.

Setting Up an LLM Provider

To enable AI features in a self-hosted RepoFlow instance, follow these steps:

  1. Choose an LLM Provider
    Select from supported providers (openai, ollama, anthropic, mistral, cohere, huggingface, azure, bedrock).

  2. Configure Environment Variables
    Set the appropriate values for the environment variables in your configuration file.

  3. Ensure Your License Includes AI Features
    AI functionality requires a valid license that includes AI support. If you are unsure whether your license includes AI features, please contact us at hello@repoflow.io.

  4. Restart RepoFlow
    Restart the RepoFlow server to apply the changes.
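For a Docker-based deployment, the steps above typically amount to adding the variables to your compose file and recreating the container. A minimal sketch, assuming RepoFlow runs as a single `repoflow` service (the service and image names here are illustrative, not the official ones):

```yaml
services:
  repoflow:
    image: repoflow/repoflow   # illustrative image name
    environment:
      - ENABLE_AI_FEATURES=true
      - LLM_PROVIDER=ollama
      - LLM_SERVER_URL=http://127.0.0.1:11434
      - LLM_MODEL_NAME=llama3.2:3b
```

After editing the file, recreating the service (for example with `docker compose up -d`) applies the new environment.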

Once configured, your RepoFlow instance will use the selected LLM provider to power its AI features.

For more details on AI capabilities, check AI Features.

Examples

General Environment Setup

For a basic configuration, you can start with the following environment variables. More details are available in the AI Environment Variables section.

ENABLE_AI_FEATURES=true
LLM_MAX_CONTEXT_SIZE=8000

Ollama Setup

Update the LLM_MODEL_NAME to match a model available in your Ollama setup.

LLM_PROVIDER=ollama
LLM_SERVER_URL=http://127.0.0.1:11434
LLM_MODEL_NAME="llama3.2:3b"
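Before pointing RepoFlow at Ollama, it can help to confirm that the server is reachable and that the model has actually been pulled. A quick check, assuming a default local Ollama installation:

```shell
# List the models currently available to the Ollama server.
curl -s http://127.0.0.1:11434/api/tags

# Pull the model if it is not in the list yet.
ollama pull llama3.2:3b
```

Note that if RepoFlow itself runs inside a container, `127.0.0.1` refers to the container, not the host; in that case `LLM_SERVER_URL` may need to point at `host.docker.internal` or the host's IP address instead.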

Hugging Face Setup

Replace <your_huggingface_api_key> with your actual Hugging Face API key.

LLM_PROVIDER=huggingface
LLM_SERVER_URL=https://router.huggingface.co
LLM_MODEL_NAME=HuggingFaceTB/SmolLM3-3B
LLM_API_KEY=<your_huggingface_api_key>
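You can sanity-check the key and model name before restarting RepoFlow. The Hugging Face router exposes an OpenAI-compatible chat completions endpoint, so a quick test (substituting your real key) looks like:

```shell
curl -s https://router.huggingface.co/v1/chat/completions \
  -H "Authorization: Bearer <your_huggingface_api_key>" \
  -H "Content-Type: application/json" \
  -d '{"model": "HuggingFaceTB/SmolLM3-3B",
       "messages": [{"role": "user", "content": "ping"}],
       "max_tokens": 5}'
```

A JSON response containing a `choices` array indicates the key and model name are valid; a 401 response points to a bad or missing API key.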

AWS Bedrock Setup

Replace the placeholders below with your actual AWS credentials and region.

LLM_PROVIDER=bedrock
LLM_MODEL_NAME="us.anthropic.claude-3-5-haiku-20241022-v1:0"
LLM_ACCESS_KEY_ID=<your_aws_access_key_id>
LLM_SECRET_ACCESS_KEY=<your_aws_secret_access_key>
LLM_REGION=<your_aws_region> # e.g. us-east-1
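The credentials must belong to an IAM identity with Bedrock invoke permissions, and the model must be enabled in the chosen region. With the AWS CLI configured to use the same credentials, you can verify both before restarting RepoFlow (the region shown is an example):

```shell
# Confirm the credentials resolve to the expected account/role.
aws sts get-caller-identity

# Check that the Claude 3.5 Haiku model is offered in the region.
aws bedrock list-foundation-models --region us-east-1 \
  --query "modelSummaries[?contains(modelId, 'claude-3-5-haiku')].modelId"
```

If the second command returns an empty list, the model is not available (or not enabled) in that region, and RepoFlow's Bedrock calls will fail with the same credentials.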