AI Service Configuration Reference

The AI component powers all LLM‑based ticket‑fix, summarisation, and knowledge‑base query features. The service supports a range of cloud‑hosted and on‑premises providers. Only one LLM (Large Language Model) provider can be active at a time; the chosen provider determines which of the remaining fields are required. You can obtain a DevLens API key by creating one in the automatically created DevLens project in the web UI.


LLM Provider Constants

| Constant | Meaning | Configuration Value |
|---|---|---|
| ollama | Self‑hosted Ollama server | DEVLENS_AI_LLM=ollama |
| gemini | Google Gemini | DEVLENS_AI_LLM=gemini |
| openai | OpenAI API | DEVLENS_AI_LLM=openai |
| openrouter | OpenRouter API | DEVLENS_AI_LLM=openrouter |
| bedrock | Amazon Bedrock | DEVLENS_AI_LLM=bedrock |
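To illustrate how the service might resolve the active provider from the DEVLENS_AI_LLM environment variable, here is a minimal sketch; the function name and error handling are illustrative, only the variable name and constants come from the table above:

```python
import os

# Provider constants from the table above.
KNOWN_PROVIDERS = {"ollama", "gemini", "openai", "openrouter", "bedrock"}

def active_provider() -> str:
    """Read DEVLENS_AI_LLM and reject values outside the documented set."""
    provider = os.environ.get("DEVLENS_AI_LLM")
    if provider not in KNOWN_PROVIDERS:
        raise ValueError(f"unknown or unset LLM provider: {provider!r}")
    return provider
```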

AI Configuration Fields

| TOML key | Environment variable | Type | Default | Example | Notes |
|---|---|---|---|---|---|
| devlens_host | DEVLENS_AI_DEVLENS_HOST | string | — | https://devlens.company.com | URL of the DevLens core API. |
| devlens_api_key | DEVLENS_AI_DEVLENS_API_KEY | string | — | s3cr3t | API key for authenticating the AI service to the core. |
| llm | DEVLENS_AI_LLM | string | — | openai | One of the constants above. |
| ollama_url | DEVLENS_AI_OLLAMA_URL | string | — | http://localhost:11434 | Required when llm=ollama. |
| google_api_key | DEVLENS_AI_GOOGLE_API_KEY | string | — | AIzaSy... | Required when llm=gemini. |
| openai_api_key | DEVLENS_AI_OPENAI_API_KEY | string | — | sk-... | Required when llm=openai. |
| openai_model | DEVLENS_AI_OPENAI_MODEL | string | — | gpt-4o-mini | Optional; defaults to the model the DevLens core expects. |
| openrouter_api_key | DEVLENS_AI_OPENROUTER_API_KEY | string | — | sk-... | Required when llm=openrouter. |
| openrouter_model | DEVLENS_AI_OPENROUTER_MODEL | string | — | gpt-4o-mini | Optional; defaults to the model configured in the core. |
| bedrock_region | DEVLENS_AI_BEDROCK_REGION | string | — | us-east-1 | Required when llm=bedrock. |
| bedrock_model | DEVLENS_AI_BEDROCK_MODEL | string | — | anthropic.claude-3-haiku-20240307 | Required when llm=bedrock. |

Mandatory combinations:

* ollama requires ollama_url
* gemini requires google_api_key
* openai requires openai_api_key (and optionally openai_model)
* openrouter requires openrouter_api_key (and optionally openrouter_model)
* bedrock requires bedrock_region + bedrock_model
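The mandatory combinations above can be checked up front. This is a hypothetical validation sketch (the function name and return shape are not part of the service); only the provider constants and field names come from the tables above:

```python
# Required configuration keys per provider, per the mandatory combinations.
REQUIRED_FIELDS = {
    "ollama": ["ollama_url"],
    "gemini": ["google_api_key"],
    "openai": ["openai_api_key"],
    "openrouter": ["openrouter_api_key"],
    "bedrock": ["bedrock_region", "bedrock_model"],
}

def missing_fields(config: dict) -> list[str]:
    """Return the required keys that are absent for the configured provider."""
    provider = config.get("llm", "")
    if provider not in REQUIRED_FIELDS:
        return ["llm"]
    return [key for key in REQUIRED_FIELDS[provider] if not config.get(key)]
```

For example, a bedrock configuration that sets only bedrock_region would be reported as missing bedrock_model.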


Example ai.toml

devlens_host    = "https://api.devlens.com"
devlens_api_key = "s3cr3t"
llm             = "openai"
openai_api_key  = "sk-XXXXXXXXXXXXXXXXXXXX"
openai_model    = "gpt-4o-mini"