
CAPSEM Proxy Setup

The CAPSEM proxy provides transparent security monitoring and control for OpenAI and Google Gemini API requests. It acts as a drop-in replacement for your LLM API base URLs, enabling real-time security policy enforcement without modifying your application code.

  • Multi-Provider Support: Proxies both OpenAI and Google Gemini APIs
  • Transparent Integration: Works as a drop-in replacement - just change the base URL (see the example below)
  • Real-time Security: CAPSEM policies enforced at multiple interception points
  • Streaming Support: Full support for SSE streaming responses
  • Tool Calling: Transparent proxy for tool/function calling
  • Multi-tenant: API keys passed through from clients, never stored server-side
Your Application (OpenAI SDK / Gemini SDK)
        ↓
CAPSEM Proxy
        ↓ Security Checks (prompt, tools, response)
OpenAI API / Gemini API
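
As the Transparent Integration bullet notes, switching to the proxy is a one-line change. The following is a minimal sketch using the OpenAI Python SDK; it assumes the proxy runs at http://localhost:8000 (the default shown later) and serves OpenAI-compatible routes under /v1, so adjust the path if your deployment differs.

from openai import OpenAI

# Point the SDK at the CAPSEM proxy instead of api.openai.com.
# Assumption: the proxy exposes OpenAI-compatible routes under /v1.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="sk-...",  # your real OpenAI key; forwarded upstream, never stored
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any model your key can access
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)

# Streaming (SSE) responses pass through the same security checks.
stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")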

While optional, we recommend using a virtual environment so that CAPSEM's dependencies stay isolated from the rest of your system.

Terminal window
python -m venv .venv
source .venv/bin/activate # On Windows use `.venv\Scripts\activate`
Terminal window
pip install capsem_proxy

Configure security policies in a config/ directory using TOML files. Each policy has its own configuration file.

Example config/debug.toml:

enabled = true

Example config/pii.toml:

enabled = true
[entity_decisions]
EMAIL_ADDRESS = "BLOCK"
CREDIT_CARD = "BLOCK"
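
With both files in place, the config/ directory looks like:

config/
├── debug.toml
└── pii.toml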

See the Policies Documentation for available policies and detailed configuration options.

Start the proxy using the launcher:

Terminal window
# Start with default settings (uses config/ directory)
python -m capsem_proxy.run_proxy
# Specify custom config directory
python -m capsem_proxy.run_proxy --config-dir /path/to/config
# Run on different port
python -m capsem_proxy.run_proxy --port 8080
# See all options
python -m capsem_proxy.run_proxy --help

The proxy will display enabled policies on startup:

============================================================
CAPSEM PROXY - Multi-tenant LLM Security Proxy
============================================================
Host: 127.0.0.1
Port: 8000
Security Policies:
  Config Dir: config
  Enabled Policies: 2
    ✓ Debug
    ✓ PIIDetection
============================================================

Check that the proxy is running:

Terminal window
curl http://localhost:8000/health

You should see:

{
  "status": "healthy",
  "version": "0.1.0",
  "providers": ["openai", "gemini"]
}

If port 8000 is already in use, specify a different port:

Terminal window
uvicorn capsem_proxy.server:app --host 127.0.0.1 --port 8080

Remember to update your client’s base_url accordingly.
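
For example, with the OpenAI SDK sketch from earlier (still assuming the proxy's OpenAI routes live under /v1):

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # match the new proxy port
    api_key="sk-...",
)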

If clients cannot connect, ensure the proxy is running and listening on the expected host and port. Check firewall settings if you are connecting from another machine.

If you see authentication errors, note that the proxy passes API keys through to the actual LLM providers; ensure your keys are valid and have the necessary permissions.
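
The same pass-through applies on the Gemini side. As a sketch with the google-genai Python SDK, assuming the proxy accepts Gemini requests at its root URL (the exact route depends on your deployment):

from google import genai
from google.genai import types

# Assumption: the CAPSEM proxy forwards Gemini API calls from its base URL.
client = genai.Client(
    api_key="AIza...",  # your real Gemini key; passed through, never stored server-side
    http_options=types.HttpOptions(base_url="http://localhost:8000"),
)

response = client.models.generate_content(
    model="gemini-2.0-flash",
    contents="Hello",
)
print(response.text)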