The configuration system in CodeGate is managed through the `Config` class in `config.py`. It supports multiple configuration sources with a clear priority order.
Configuration sources are evaluated in the following order, from highest to lowest priority:
- CLI arguments
- Environment variables
- Configuration file (YAML)
- Default values (including default prompts from `prompts/default.yaml`)
Values from higher-priority sources take precedence over lower-priority values.
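The precedence rules above can be sketched as a layered merge. This is a hypothetical illustration (the `resolve` helper and its layers are not CodeGate's actual internals): each later source overrides earlier ones only where it actually provides a value.

```python
# Hypothetical sketch of layered configuration precedence:
# later sources override earlier ones, but only for keys they set
# (None means "not provided at this layer").
def resolve(defaults, config_file, env, cli):
    merged = dict(defaults)
    for layer in (config_file, env, cli):
        merged.update({k: v for k, v in layer.items() if v is not None})
    return merged

settings = resolve(
    defaults={"port": 8989, "host": "localhost"},
    config_file={"port": 9000, "host": None},
    env={"port": 9001},
    cli={"port": 9002},
)
print(settings)  # {'port': 9002, 'host': 'localhost'}
```

Here the CLI value for `port` wins over the environment and file values, while `host` falls through to the default because no higher-priority layer set it.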
The following default values are used:

- Port: `8989`
- Proxy port: `8990`
- Host: `"localhost"`
- Log level: `"INFO"`
- Log format: `"JSON"`
- Prompts: default prompts from `prompts/default.yaml`
- Provider URLs:
  - vLLM: `"http://localhost:8000"`
  - OpenAI: `"https://api.openai.com/v1"`
  - Anthropic: `"https://api.anthropic.com/v1"`
  - Ollama: `"http://localhost:11434"`
- Certificate configuration:
  - Certs directory: `"./certs"`
  - CA certificate: `"ca.crt"`
  - CA private key: `"ca.key"`
  - Server certificate: `"server.crt"`
  - Server private key: `"server.key"`
Load configuration from a YAML file:

```python
config = Config.from_file("config.yaml")
```
Example `config.yaml`:

```yaml
port: 8989
proxy_port: 8990
host: localhost
log_level: INFO
log_format: JSON
provider_urls:
  vllm: "https://vllm.example.com"
  openai: "https://api.openai.com/v1"
  anthropic: "https://api.anthropic.com/v1"
  ollama: "http://localhost:11434"
certs_dir: "./certs"
ca_cert: "ca.crt"
ca_key: "ca.key"
server_cert: "server.crt"
server_key: "server.key"
```
Environment variables are automatically loaded with these mappings:

- `CODEGATE_APP_PORT`: server port
- `CODEGATE_APP_PROXY_PORT`: server proxy port
- `CODEGATE_APP_HOST`: server host
- `CODEGATE_APP_LOG_LEVEL`: logging level
- `CODEGATE_LOG_FORMAT`: log format
- `CODEGATE_PROMPTS_FILE`: path to prompts YAML file
- `CODEGATE_PROVIDER_VLLM_URL`: vLLM provider URL
- `CODEGATE_PROVIDER_OPENAI_URL`: OpenAI provider URL
- `CODEGATE_PROVIDER_ANTHROPIC_URL`: Anthropic provider URL
- `CODEGATE_PROVIDER_OLLAMA_URL`: Ollama provider URL
- `CODEGATE_CERTS_DIR`: directory for certificate files
- `CODEGATE_CA_CERT`: CA certificate file name
- `CODEGATE_CA_KEY`: CA key file name
- `CODEGATE_SERVER_CERT`: server certificate file name
- `CODEGATE_SERVER_KEY`: server key file name
Load configuration from environment variables:

```python
config = Config.from_env()
```
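A minimal sketch of this kind of variable-to-field mapping (the `ENV_MAP` table and `load_from_env` helper below are hypothetical, not CodeGate's actual internals):

```python
import os

# Hypothetical sketch: each CODEGATE_* variable maps to a settings
# field and a cast applied to the raw string value from the environment.
ENV_MAP = {
    "CODEGATE_APP_PORT": ("port", int),
    "CODEGATE_APP_HOST": ("host", str),
    "CODEGATE_LOG_FORMAT": ("log_format", str),
}

def load_from_env(environ=os.environ):
    settings = {}
    for var, (field, cast) in ENV_MAP.items():
        if var in environ:  # unset variables are simply skipped
            settings[field] = cast(environ[var])
    return settings

print(load_from_env({"CODEGATE_APP_PORT": "8989"}))  # {'port': 8989}
```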
Network settings can be configured in several ways:

- Configuration file:

  ```yaml
  port: 8989        # Port to listen on (1-65535)
  proxy_port: 8990  # Proxy port to listen on (1-65535)
  host: "localhost" # Host to bind to
  ```

- Environment variables:

  ```shell
  export CODEGATE_APP_PORT=8989
  export CODEGATE_APP_PROXY_PORT=8990
  export CODEGATE_APP_HOST=localhost
  ```

- CLI flags:

  ```shell
  codegate serve --port 8989 --proxy-port 8990 --host localhost
  ```
Provider URLs can be configured in several ways:

- Configuration file:

  ```yaml
  provider_urls:
    vllm: "https://vllm.example.com"       # /v1 path is added automatically
    openai: "https://api.openai.com/v1"
    anthropic: "https://api.anthropic.com/v1"
    ollama: "http://localhost:11434"       # /api path is added automatically
  ```

- Environment variables:

  ```shell
  export CODEGATE_PROVIDER_VLLM_URL=https://vllm.example.com
  export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
  export CODEGATE_PROVIDER_ANTHROPIC_URL=https://api.anthropic.com/v1
  export CODEGATE_PROVIDER_OLLAMA_URL=http://localhost:11434
  ```

- CLI flags:

  ```shell
  codegate serve --vllm-url https://vllm.example.com --ollama-url http://localhost:11434
  ```
Note:

- For the vLLM provider, the `/v1` path is automatically appended to the base URL if not present.
- For the Ollama provider, the `/api` path is automatically appended to the base URL if not present.
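The path-appending behavior in the note can be illustrated with a small helper. This is a hypothetical sketch of the idea (`normalize_provider_url` is not a real CodeGate function):

```python
# Hypothetical sketch: append the provider-specific path suffix
# only when the base URL does not already end with it.
def normalize_provider_url(url: str, suffix: str) -> str:
    base = url.rstrip("/")
    if base.endswith("/" + suffix):
        return base
    return base + "/" + suffix

print(normalize_provider_url("https://vllm.example.com", "v1"))
# https://vllm.example.com/v1
print(normalize_provider_url("http://localhost:11434/api", "api"))
# http://localhost:11434/api
```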
Certificate files can be configured in several ways:

- Configuration file:

  ```yaml
  certs_dir: "./certs"
  ca_cert: "ca.crt"
  ca_key: "ca.key"
  server_cert: "server.crt"
  server_key: "server.key"
  ```

- Environment variables:

  ```shell
  export CODEGATE_CERTS_DIR=./certs
  export CODEGATE_CA_CERT=ca.crt
  export CODEGATE_CA_KEY=ca.key
  export CODEGATE_SERVER_CERT=server.crt
  export CODEGATE_SERVER_KEY=server.key
  ```

- CLI flags:

  ```shell
  codegate serve --certs-dir ./certs --ca-cert ca.crt --ca-key ca.key --server-cert server.crt --server-key server.key
  ```
Available log levels (case-insensitive):

- `ERROR`
- `WARNING`
- `INFO`
- `DEBUG`

Available log formats (case-insensitive):

- `JSON`
- `TEXT`
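Case-insensitive validation of these values can be sketched as follows (a hypothetical helper; CodeGate's own validation lives in `Config`):

```python
# Hypothetical sketch: normalize to upper case, then check membership
# against the documented sets of valid values.
VALID_LOG_LEVELS = {"ERROR", "WARNING", "INFO", "DEBUG"}
VALID_LOG_FORMATS = {"JSON", "TEXT"}

def validate_logging(level: str, fmt: str) -> tuple:
    level, fmt = level.upper(), fmt.upper()
    if level not in VALID_LOG_LEVELS:
        raise ValueError(f"invalid log level: {level}")
    if fmt not in VALID_LOG_FORMATS:
        raise ValueError(f"invalid log format: {fmt}")
    return level, fmt

print(validate_logging("debug", "json"))  # ('DEBUG', 'JSON')
```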
Prompts can be configured in several ways:

- Default prompts:
  - Located in `prompts/default.yaml`
  - Loaded automatically if no other prompts are specified
- Configuration file:

  ```yaml
  # Option 1: Direct prompts definition
  prompts:
    my_prompt: "Custom prompt text"
    another_prompt: "Another prompt text"

  # Option 2: Reference to prompts file
  prompts: "path/to/prompts.yaml"
  ```

- Environment variable:

  ```shell
  export CODEGATE_PROMPTS_FILE=path/to/prompts.yaml
  ```

- CLI flag:

  ```shell
  codegate serve --prompts path/to/prompts.yaml
  ```
Prompts files are defined in YAML format with string values:

```yaml
prompt_name: "Prompt text content"
another_prompt: "More prompt text"
multiline_prompt: |
  This is a multi-line prompt.
  It can span multiple lines.
```
Access prompts in code:

```python
config = Config.load()
prompt = config.prompts.prompt_name
```
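Attribute-style access like `config.prompts.prompt_name` could be backed by something along these lines. This is a hypothetical sketch of the pattern, not CodeGate's real prompts class:

```python
# Hypothetical sketch: wrap the parsed YAML mapping so prompt names
# can be read as attributes instead of dictionary keys.
class PromptConfig:
    def __init__(self, prompts):
        self._prompts = dict(prompts)

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails,
        # so _prompts itself is never routed through here.
        try:
            return self._prompts[name]
        except KeyError:
            raise AttributeError(f"no such prompt: {name}")

prompts = PromptConfig({"my_prompt": "Custom prompt text"})
print(prompts.my_prompt)  # Custom prompt text
```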
The configuration system uses a custom `ConfigurationError` exception for handling configuration-related errors, such as:
- Invalid port numbers (must be between 1 and 65535)
- Invalid proxy port numbers (must be between 1 and 65535)
- Invalid log levels
- Invalid log formats
- YAML parsing errors
- File reading errors
- Invalid prompt values (must be strings)
- Missing or invalid prompts files
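A sketch of how such checks might raise `ConfigurationError` (the exception name comes from the text above; the validation helpers themselves are hypothetical):

```python
class ConfigurationError(Exception):
    """Raised for configuration-related errors."""

def validate_port(port, name="port"):
    # Ports and proxy ports must fall in the valid TCP range.
    if not 1 <= port <= 65535:
        raise ConfigurationError(f"{name} must be between 1 and 65535, got {port}")
    return port

def validate_prompts(prompts):
    # Every prompt value must be a string.
    for key, value in prompts.items():
        if not isinstance(value, str):
            raise ConfigurationError(f"prompt {key!r} must be a string")
    return prompts

print(validate_port(8989))  # 8989
```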