A unified, provider-agnostic chat completions API server supporting OpenAI and AWS Bedrock
```shell
# Clone and install
git clone https://github.com/teabranch/open-bedrock-server.git
cd open-bedrock-server
uv pip install -e .

# Configure environment
bedrock-chat config set

# Start server
bedrock-chat serve --host 0.0.0.0 --port 8000
```
```shell
# Test the unified endpoint
curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": false
  }'
```
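If you prefer Python over curl, the same request can be made with only the standard library. A minimal sketch, assuming the server address, API key, and model name from the example above (all placeholders):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # placeholder: wherever `bedrock-chat serve` runs
API_KEY = "your-api-key"            # placeholder API key

def build_chat_request(model, user_text, stream=False):
    """Assemble an OpenAI-format chat completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "stream": stream,
    }

def send_chat_request(payload):
    """POST the payload to the unified endpoint and return the parsed JSON."""
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_chat_request("gpt-4o-mini", "Hello!")
# send_chat_request(payload)  # uncomment with the server running
```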
```shell
# Upload a file
curl -X POST http://localhost:8000/v1/files \
  -H "Authorization: Bearer your-api-key" \
  -F "file=@data.csv" \
  -F "purpose=assistants"
```
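Under the hood the upload is an ordinary multipart/form-data POST with `file` and `purpose` fields. A standard-library sketch of building that request (server address, API key, and file name are placeholders from the curl example; the boundary string is arbitrary):

```python
import json
import urllib.request
import uuid

def build_multipart(file_name, file_bytes, purpose):
    """Build a multipart/form-data body carrying a `file` and a `purpose` field."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="file"; filename="{file_name}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).encode() + file_bytes + (
        f"\r\n--{boundary}\r\n"
        'Content-Disposition: form-data; name="purpose"\r\n\r\n'
        f"{purpose}\r\n"
        f"--{boundary}--\r\n"
    ).encode()
    return body, f"multipart/form-data; boundary={boundary}"

def upload_file(path, purpose="assistants"):
    """POST the file to /v1/files; the response JSON includes the new file id."""
    with open(path, "rb") as f:
        body, content_type = build_multipart(path, f.read(), purpose)
    req = urllib.request.Request(
        "http://localhost:8000/v1/files",  # placeholder server address
        data=body,
        headers={
            "Content-Type": content_type,
            "Authorization": "Bearer your-api-key",  # placeholder API key
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```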
```shell
# Use file in chat completion
curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Analyze this data"}],
    "file_ids": ["file-abc123def456"]
  }'
```
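The file_ids field is this server's extension to the OpenAI chat payload, not part of the standard schema, so generic clients simply add it as an extra top-level key. A small sketch of assembling that payload (the file id is the placeholder from the upload example):

```python
def build_chat_with_files(model, user_text, file_ids):
    """Chat payload with the server's file_ids extension field, which pulls
    previously uploaded file content into the conversation as context."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "file_ids": list(file_ids),  # ids returned by the /v1/files upload
    }

payload = build_chat_with_files(
    "gpt-4o-mini", "Analyze this data", ["file-abc123def456"]
)
```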
The /v1/chat/completions endpoint is the only endpoint you need: it accepts requests in either OpenAI or Bedrock format and returns responses in either format, with or without streaming.
All format combinations are supported through the unified endpoint:

| Input Format | Output Format | Use Case | Streaming |
|---|---|---|---|
| OpenAI | OpenAI | Standard OpenAI usage | ✅ |
| OpenAI | Bedrock Claude | OpenAI clients → Bedrock response | ✅ |
| OpenAI | Bedrock Titan | OpenAI clients → Titan response | ✅ |
| Bedrock Claude | OpenAI | Bedrock clients → OpenAI response | ✅ |
| Bedrock Claude | Bedrock Claude | Claude format preserved | ✅ |
| Bedrock Titan | OpenAI | Titan clients → OpenAI response | ✅ |
| Bedrock Titan | Bedrock Titan | Titan format preserved | ✅ |
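Every row of the table also supports streaming. Assuming streamed responses follow the OpenAI server-sent-events convention (`data: {...}` chunk lines ending with `data: [DONE]`, which the endpoint's OpenAI compatibility suggests), a minimal parser sketch for the stream body:

```python
import json

def iter_stream_content(lines):
    """Yield the text deltas from OpenAI-style SSE chunk lines."""
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            yield delta

# Hypothetical chunk lines, shaped like OpenAI streaming output:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    "data: [DONE]",
]
print("".join(iter_stream_content(sample)))  # -> Hello!
```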
- One endpoint: /v1/chat/completions handles everything
- Use the file_ids parameter to include file content as context in conversations

This project is licensed under the MIT License - see the LICENSE file for details.