This guide provides comprehensive information for developers working on the CodeGate project.
CodeGate is a configurable generative AI gateway designed to protect developers from potential AI-related security risks. Key features include:
- Secrets exfiltration prevention
- Secure coding recommendations
- Prevention of AI recommending deprecated/malicious libraries
- Modular system prompts configuration
- Multiple AI provider support with configurable endpoints
To work on CodeGate locally, you will need:

- Python 3.11 or higher
- Poetry for dependency management
- Docker or Podman (for containerized deployment)
- Visual Studio Code (recommended IDE)
1. Clone the repository:

   ```shell
   git clone https://github.com/stacklok/codegate.git
   cd codegate
   ```

2. Install Poetry following the official installation guide.

3. Install project dependencies:

   ```shell
   poetry install --with dev
   ```
To set up the dashboard frontend, clone the repository:

```shell
git clone https://github.com/stacklok/codegate-ui
cd codegate-ui
```

To install all dependencies for your local development environment, run:

```shell
npm install
```

Run the development server:

```shell
npm run dev
```

Run the build command:

```shell
npm run build
```

Preview the production build:

```shell
npm run preview
```
```
codegate/
├── pyproject.toml            # Project configuration and dependencies
├── poetry.lock               # Lock file (committed to version control)
├── prompts/                  # System prompts configuration
│   └── default.yaml          # Default system prompts
├── src/
│   └── codegate/             # Source code
│       ├── __init__.py
│       ├── cli.py            # Command-line interface
│       ├── config.py         # Configuration management
│       ├── exceptions.py     # Shared exceptions
│       ├── codegate_logging.py  # Logging setup
│       ├── prompts.py        # Prompts management
│       ├── server.py         # Main server implementation
│       └── providers/        # External service providers
│           ├── anthropic/    # Anthropic provider implementation
│           ├── openai/       # OpenAI provider implementation
│           ├── vllm/         # vLLM provider implementation
│           └── base.py       # Base provider interface
├── tests/                    # Test files
└── docs/                     # Documentation
```
Poetry commands for managing your development environment:

- `poetry install`: install project dependencies
- `poetry add package-name`: add a new package dependency
- `poetry add --group dev package-name`: add a development dependency
- `poetry remove package-name`: remove a package
- `poetry update`: update dependencies to their latest versions
- `poetry show`: list all installed packages
- `poetry env info`: show information about the virtual environment
The project uses several tools to maintain code quality:

- Black for code formatting:

  ```shell
  poetry run black .
  ```

- Ruff for linting:

  ```shell
  poetry run ruff check .
  ```

- Bandit for security checks:

  ```shell
  poetry run bandit -r src/
  ```
Run the test suite with coverage:

```shell
poetry run pytest
```

Tests are located in the `tests/` directory and follow the same structure as the source code.
The project includes a Makefile for common development tasks:

- `make install`: install all dependencies
- `make format`: format code using Black and Ruff
- `make lint`: run linting checks
- `make test`: run tests with coverage
- `make security`: run security checks
- `make build`: build distribution packages
- `make all`: run all checks and build (recommended before committing)
CodeGate uses a hierarchical configuration system with the following priority (highest to lowest):
- CLI arguments
- Environment variables
- Config file (YAML)
- Default values (including default prompts)
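This precedence can be sketched as a layered merge, where lower-priority layers only supply values the higher-priority layers left unset (the function and key names below are illustrative, not CodeGate's actual API):

```python
def resolve_config(cli_args: dict, env_vars: dict, config_file: dict, defaults: dict) -> dict:
    """Merge config layers; earlier arguments win over later ones."""
    resolved = dict(defaults)
    # Apply layers from lowest to highest priority so higher layers overwrite.
    for layer in (config_file, env_vars, cli_args):
        resolved.update({k: v for k, v in layer.items() if v is not None})
    return resolved

# Example: the CLI overrides the config file, which overrides the default.
settings = resolve_config(
    cli_args={"port": 9000},
    env_vars={},
    config_file={"port": 8080, "host": "0.0.0.0"},
    defaults={"port": 8989, "host": "localhost", "log_level": "INFO"},
)
print(settings["port"])       # 9000 (CLI wins)
print(settings["host"])       # 0.0.0.0 (config file beats default)
print(settings["log_level"])  # INFO (default)
```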
- Port: server port (default: `8989`)
- Host: server host (default: `"localhost"`)
- Log level: logging verbosity level (`ERROR`|`WARNING`|`INFO`|`DEBUG`)
- Log format: log format (`JSON`|`TEXT`)
- Prompts: system prompts configuration
- Provider URLs: AI provider endpoint configuration
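These options map naturally onto a small settings object. The sketch below uses the documented defaults for port and host; the class name, the field validation, and the default log level/format are illustrative assumptions, not CodeGate's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class ServerSettings:
    """Illustrative settings object using the defaults documented above."""
    port: int = 8989
    host: str = "localhost"
    log_level: str = "INFO"   # ERROR | WARNING | INFO | DEBUG (assumed default)
    log_format: str = "JSON"  # JSON | TEXT (assumed default)

    def __post_init__(self):
        if self.log_level not in {"ERROR", "WARNING", "INFO", "DEBUG"}:
            raise ValueError(f"invalid log level: {self.log_level}")
        if self.log_format not in {"JSON", "TEXT"}:
            raise ValueError(f"invalid log format: {self.log_format}")


config = ServerSettings(log_level="DEBUG")
print(config.port)  # 8989
```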
See Configuration system for detailed information.
CodeGate supports multiple AI providers through a modular provider system.
- vLLM provider
  - Default URL: `http://localhost:8000`
  - Supports OpenAI-compatible APIs
  - Automatically adds the `/v1` path to the base URL
  - Model names are prefixed with `hosted_vllm/`
- OpenAI provider
  - Default URL: `https://api.openai.com/v1`
  - Standard OpenAI API implementation
- Anthropic provider
  - Default URL: `https://api.anthropic.com/v1`
  - Anthropic Claude API implementation
- Ollama provider
  - Default URL: `http://localhost:11434`
  - Endpoints:
    - Native Ollama API: `/ollama/api/chat`
    - OpenAI-compatible: `/ollama/chat/completions`
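As an example, the Ollama provider's OpenAI-compatible route can be exercised with a plain HTTP request. This is a sketch: the default server port `8989` and the route come from the documentation above, the model name is a placeholder, and the actual POST is commented out so the snippet runs without a live server:

```python
import json
import urllib.request

# CodeGate listens on port 8989 by default; the route is the
# OpenAI-compatible Ollama endpoint listed above.
url = "http://localhost:8989/ollama/chat/completions"
payload = {
    "model": "llama3",  # placeholder; use a model available in your Ollama install
    "messages": [{"role": "user", "content": "Hello"}],
}
request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# Uncomment to send against a running CodeGate instance:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
print(request.full_url)  # http://localhost:8989/ollama/chat/completions
```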
Provider URLs can be configured through:

- Config file (`config.yaml`):

  ```yaml
  provider_urls:
    vllm: "https://vllm.example.com"
    openai: "https://api.openai.com/v1"
    anthropic: "https://api.anthropic.com/v1"
    ollama: "http://localhost:11434"  # /api path added automatically
  ```

- Environment variables:

  ```shell
  export CODEGATE_PROVIDER_VLLM_URL=https://vllm.example.com
  export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
  export CODEGATE_PROVIDER_ANTHROPIC_URL=https://api.anthropic.com/v1
  export CODEGATE_PROVIDER_OLLAMA_URL=http://localhost:11434
  ```

- CLI flags:

  ```shell
  codegate serve --vllm-url https://vllm.example.com --ollama-url http://localhost:11434
  ```
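The environment-variable layer can be resolved against the documented defaults roughly like this (the helper function is illustrative; the variable names and default URLs are the ones documented above):

```python
import os

# Documented default endpoint for each provider.
DEFAULT_PROVIDER_URLS = {
    "vllm": "http://localhost:8000",
    "openai": "https://api.openai.com/v1",
    "anthropic": "https://api.anthropic.com/v1",
    "ollama": "http://localhost:11434",
}


def provider_url(provider: str) -> str:
    """Return the CODEGATE_PROVIDER_<NAME>_URL override if set, else the default."""
    env_var = f"CODEGATE_PROVIDER_{provider.upper()}_URL"
    return os.environ.get(env_var, DEFAULT_PROVIDER_URLS[provider])


os.environ["CODEGATE_PROVIDER_VLLM_URL"] = "https://vllm.example.com"
print(provider_url("vllm"))    # https://vllm.example.com (env override)
print(provider_url("ollama"))  # http://localhost:11434 (default)
```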
To add a new provider:

1. Create a new directory in `src/codegate/providers/`
2. Implement the required components:
   - `provider.py`: main provider class extending `BaseProvider`
   - `adapter.py`: input/output normalizers
   - `__init__.py`: export the provider class
Example structure:

```python
from codegate.providers.base import BaseProvider


class NewProvider(BaseProvider):
    def __init__(self, ...):
        super().__init__(
            InputNormalizer(),
            OutputNormalizer(),
            completion_handler,
            pipeline_processor,
            fim_pipeline_processor,
        )

    @property
    def provider_route_name(self) -> str:
        return "provider_name"

    def _setup_routes(self):
        # Implement route setup
        pass
```
Default prompts are stored in `prompts/default.yaml`. These prompts are loaded automatically when no other prompts are specified.
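Internally, prompts loaded from YAML can be exposed through attribute access (e.g. `config.prompts.my_prompt`). A minimal sketch of that pattern, where the class name and prompt names are illustrative and YAML parsing is replaced by a dict literal so the snippet is self-contained:

```python
class PromptRegistry:
    """Wrap a dict of prompts so each prompt is reachable as an attribute."""

    def __init__(self, prompts: dict):
        self._prompts = prompts

    def __getattr__(self, name: str) -> str:
        # Only called when normal attribute lookup fails, so _prompts is safe.
        try:
            return self._prompts[name]
        except KeyError:
            raise AttributeError(f"unknown prompt: {name}") from None


# Stand-in for the parsed contents of a prompts YAML file (names hypothetical).
prompts = PromptRegistry({
    "default_chat": "You are a secure coding assistant.",
    "secrets_warning": "Never echo credentials back to the user.",
})
print(prompts.default_chat)  # You are a secure coding assistant.
```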
1. Create a new YAML file following the format:

   ```yaml
   prompt_name: "Prompt text content"
   another_prompt: "More prompt text"
   ```

2. Use the prompts file:

   ```
   # Via CLI
   codegate serve --prompts my-prompts.yaml

   # Via config.yaml
   prompts: "path/to/prompts.yaml"

   # Via environment
   export CODEGATE_PROMPTS_FILE=path/to/prompts.yaml
   ```

3. View loaded prompts:

   ```shell
   # Show default prompts
   codegate show-prompts

   # Show custom prompts
   codegate show-prompts --prompts my-prompts.yaml
   ```

4. Write tests for prompt functionality:

   ```python
   def test_custom_prompts():
       config = Config.load(prompts_path="path/to/test/prompts.yaml")
       assert config.prompts.my_prompt == "Expected prompt text"
   ```
The main command-line interface is implemented in `cli.py`. Basic usage:

```shell
# Start the server with default settings
codegate serve

# Start with custom configuration
codegate serve --port 8989 --host localhost --log-level DEBUG

# Start with custom prompts
codegate serve --prompts my-prompts.yaml

# Start with a custom provider URL
codegate serve --vllm-url https://vllm.example.com
```
See CLI commands and flags for detailed command information.