# CAPSEM Proxy Setup
The CAPSEM proxy provides transparent security monitoring and control for OpenAI and Google Gemini API requests. It acts as a drop-in replacement for your LLM API base URLs, enabling real-time security policy enforcement without modifying your application code.
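For example, with the OpenAI Python SDK the switch is a one-line change. A minimal sketch, assuming the proxy listens on `localhost:8000` and exposes an OpenAI-compatible `/v1` path (both are assumptions; adjust them to your deployment):

```python
# Hypothetical sketch: route OpenAI SDK traffic through a local CAPSEM proxy.
# The base URL below (host, port, and "/v1" path) is an assumption; point it
# at wherever your proxy actually listens.
PROXY_BASE_URL = "http://localhost:8000/v1"


def make_client(api_key: str):
    """Build an OpenAI client whose requests are routed via the proxy."""
    # Imported lazily so this module loads even without the SDK installed.
    from openai import OpenAI  # pip install openai

    # The key is still your real provider key: the proxy forwards it
    # per-request and never stores it server-side.
    return OpenAI(base_url=PROXY_BASE_URL, api_key=api_key)
```

Everything else in your application stays unchanged; the proxy forwards each request to the real provider after running its security checks.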
## Features

- Multi-Provider Support: Proxies both OpenAI and Google Gemini APIs
- Transparent Integration: Works as a drop-in replacement - just change the base URL
- Real-time Security: CAPSEM policies enforced at multiple interception points
- Streaming Support: Full support for SSE streaming responses
- Tool Calling: Transparent proxy for tool/function calling
- Multi-tenant: API keys passed through from clients, never stored server-side
## Architecture

```
Your Application (OpenAI SDK / Gemini SDK)
        ↓
CAPSEM Proxy
        ↓
Security Checks (prompt, tools, response)
        ↓
OpenAI API / Gemini API
```

## Installation
### Set up a venv (recommended)

While optional, we recommend using a virtual environment, as this ensures that the dependencies for CAPSEM are isolated from the rest of your system.

```shell
python -m venv .venv
source .venv/bin/activate  # On Windows use `.venv\Scripts\activate`
```

### Install CAPSEM Proxy

```shell
pip install capsem_proxy
```

## Configuration
### Security Policies

Configure security policies in a `config/` directory using TOML files. Each policy has its own configuration file.

Example `config/debug.toml`:

```toml
enabled = true
```

Example `config/pii.toml`:

```toml
enabled = true

[entity_decisions]
EMAIL_ADDRESS = "BLOCK"
CREDIT_CARD = "BLOCK"
```

See the Policies Documentation for available policies and detailed configuration options.
## Running the Proxy

Start the proxy using the launcher:

```shell
# Start with default settings (uses config/ directory)
python -m capsem_proxy.run_proxy

# Specify custom config directory
python -m capsem_proxy.run_proxy --config-dir /path/to/config

# Run on different port
python -m capsem_proxy.run_proxy --port 8080

# See all options
python -m capsem_proxy.run_proxy --help
```

The proxy will display enabled policies on startup:
```
============================================================
 CAPSEM PROXY - Multi-tenant LLM Security Proxy
============================================================
 Host: 127.0.0.1
 Port: 8000

 Security Policies:
   Config Dir: config
   Enabled Policies: 2
     ✓ Debug
     ✓ PIIDetection
============================================================
```

## Verify Installation
Check that the proxy is running:

```shell
curl http://localhost:8000/health
```

You should see:

```json
{
  "status": "healthy",
  "version": "0.1.0",
  "providers": ["openai", "gemini"]
}
```
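The same check can be scripted. A small sketch using only the standard library; the payload fields follow the sample response above:

```python
import json
import urllib.request


def parse_health(body: dict) -> bool:
    """Interpret a /health payload (field names from the sample above)."""
    return body.get("status") == "healthy" and "openai" in body.get("providers", [])


def check_proxy(base_url: str = "http://localhost:8000") -> bool:
    """Fetch /health from a running proxy and report readiness."""
    with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
        return parse_health(json.loads(resp.read()))
```

`check_proxy()` returns `True` once the proxy reports `"status": "healthy"`.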
## Next Steps

- OpenAI Proxy Tutorial - Learn how to proxy OpenAI API calls
- Gemini Proxy Tutorial - Learn how to proxy Google Gemini API calls
## Troubleshooting

### Port Already in Use

If port 8000 is already in use, specify a different port:

```shell
uvicorn capsem_proxy.server:app --host 127.0.0.1 --port 8080
```

Remember to update your client’s `base_url` accordingly.
### Connection Refused

Ensure the proxy is running and listening on the correct host/port. Check firewall settings if connecting from another machine.

### API Key Errors

The proxy passes through authentication to the actual LLM providers. Ensure your API keys are valid and have the necessary permissions.