Connecting an LLM Provider
RepoFlow allows you to integrate an external Large Language Model (LLM) provider to enable AI-powered features.
Configuration Options
The environment variables required to configure an LLM provider are listed in the AI Environment Variables section.
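As a rough illustration of the shape of that configuration, the sketch below reads provider settings from the process environment at startup. The variable names (AI_PROVIDER, AI_API_KEY, AI_MODEL) and the LlmConfig/loadLlmConfig names are hypothetical placeholders, not RepoFlow's documented names; refer to the AI Environment Variables section for the actual variables.

```typescript
// Minimal sketch, assuming hypothetical variable names (AI_PROVIDER,
// AI_API_KEY, AI_MODEL) -- see the AI Environment Variables section
// for RepoFlow's real ones.

interface LlmConfig {
  provider: string;  // e.g. "openai" or "ollama"
  apiKey?: string;   // may be unnecessary for local providers such as ollama
  model?: string;    // optional model identifier
}

function loadLlmConfig(env: NodeJS.ProcessEnv = process.env): LlmConfig {
  const provider = env.AI_PROVIDER;
  if (!provider) {
    throw new Error("AI_PROVIDER is not set; AI features remain disabled.");
  }
  return { provider, apiKey: env.AI_API_KEY, model: env.AI_MODEL };
}
```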
Setting Up an LLM Provider
To enable AI features in a self-hosted RepoFlow instance, follow these steps:
- Choose an LLM Provider: Select one of the supported providers: openai, ollama, anthropic, mistral, cohere, huggingface, azure, or bedrock.
- Configure Environment Variables: Set the appropriate values for the environment variables in your configuration file (a validation sketch follows this list).
- Ensure Your License Includes AI Features: AI functionality requires a valid license that includes AI support. If you are unsure whether your license includes AI features, please contact us at hello@repoflow.io.
- Restart RepoFlow: Restart the RepoFlow server to apply the changes.
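A mistyped provider name is a common reason AI features fail to initialize. As an illustration only (this helper is not part of RepoFlow), a startup check could validate the configured value against the supported list before enabling anything:

```typescript
// Hypothetical startup check: reject a misspelled provider name early.
// The supported values mirror the list above.
const SUPPORTED_PROVIDERS = [
  "openai", "ollama", "anthropic", "mistral",
  "cohere", "huggingface", "azure", "bedrock",
] as const;

type Provider = (typeof SUPPORTED_PROVIDERS)[number];

function assertSupportedProvider(value: string): Provider {
  if (!(SUPPORTED_PROVIDERS as readonly string[]).includes(value)) {
    throw new Error(
      `Unsupported LLM provider "${value}". Expected one of: ` +
        SUPPORTED_PROVIDERS.join(", ")
    );
  }
  return value as Provider;
}
```

Failing fast like this surfaces a configuration typo at restart time rather than when the first AI request is made.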
Once configured, your RepoFlow instance uses the selected LLM provider to power its AI features.
For more details on AI capabilities, see AI Features.