| summary | read_when |
|---|---|
| Use Amazon Bedrock (Converse API) models with OpenClaw | |
# Amazon Bedrock
OpenClaw can use Amazon Bedrock models via pi‑ai’s Bedrock Converse streaming provider. Bedrock auth uses the AWS SDK default credential chain, not an API key.
## What pi-ai supports

- Provider: `amazon-bedrock`
- API: `bedrock-converse-stream`
- Auth: AWS credentials (env vars, shared config, or instance role)
- Region: `AWS_REGION` or `AWS_DEFAULT_REGION` (default: `us-east-1`)
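If you want to sanity-check that the default credential chain resolves on the gateway host before wiring up OpenClaw, a quick check with the AWS CLI works (this assumes the AWS CLI is installed; OpenClaw itself does not need it):

```bash
# Prints the account and ARN of whatever identity the default chain resolves to
aws sts get-caller-identity

# Confirm which region OpenClaw will pick up
echo "$AWS_REGION" "$AWS_DEFAULT_REGION"
```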
## Automatic model discovery

If AWS credentials are detected, OpenClaw can automatically discover Bedrock models that support streaming and text output. Discovery uses `bedrock:ListFoundationModels` and `bedrock:ListInferenceProfiles`, and results are cached (default: 1 hour).
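To see roughly what discovery sees, you can call the same APIs yourself with the AWS CLI (assuming it is installed and the same credentials are in scope; the `--query` filters below are just one way to approximate the streaming/text filter described above):

```bash
# Foundation models that support streaming (discovery also filters for text output)
aws bedrock list-foundation-models \
  --region us-east-1 \
  --by-output-modality TEXT \
  --query 'modelSummaries[?responseStreamingSupported==`true`].modelId'

# Cross-region inference profiles
aws bedrock list-inference-profiles \
  --region us-east-1 \
  --query 'inferenceProfileSummaries[].inferenceProfileId'
```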
Config options live under `models.bedrockDiscovery`:
```json5
{
  models: {
    bedrockDiscovery: {
      enabled: true,
      region: "us-east-1",
      providerFilter: ["anthropic", "amazon"],
      refreshInterval: 3600,
      defaultContextWindow: 32000,
      defaultMaxTokens: 4096,
      includeInferenceProfiles: true
    }
  }
}
```
Notes:

- `enabled` defaults to `true` when AWS credentials are present.
- `region` defaults to `AWS_REGION` or `AWS_DEFAULT_REGION`, then `us-east-1`.
- `providerFilter` matches Bedrock provider names (for example `anthropic`).
- `refreshInterval` is in seconds; set it to `0` to disable caching.
- `defaultContextWindow` (default: `32000`) and `defaultMaxTokens` (default: `4096`) are used for discovered models; override them if you know your model's limits.
- `includeInferenceProfiles` (default: `true`) enables discovery of inference profiles.
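As a concrete example of those options, a minimal sketch that keeps discovery on but limits it to Anthropic models and turns off the cache would look like:

```json5
{
  models: {
    bedrockDiscovery: {
      providerFilter: ["anthropic"],  // only Anthropic-provided Bedrock models
      refreshInterval: 0              // re-query Bedrock on every run (no caching)
    }
  }
}
```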
## Inference Profiles

AWS Bedrock inference profiles provide cross-region routing and improved availability for foundation models. When you use an inference profile ID (like `us.anthropic.claude-opus-4-5-20251101-v1:0`), Bedrock can automatically route your request to the best available region.
Benefits:
- Higher availability: Requests route to healthy regions automatically
- Lower latency: Traffic goes to the nearest available region
- Same API: Works identically to base model IDs
Discovery behavior:

- OpenClaw discovers both base models and inference profiles by default
- Inference profiles inherit metadata (context window, reasoning support, etc.) from their base model
- Both appear in `openclaw models list` output
- Use either ID interchangeably in your configuration
Example:

```bash
# List all discovered models (includes inference profiles)
openclaw models list

# Use an inference profile in config
openclaw config set agents.defaults.model.primary \
  "amazon-bedrock/us.anthropic.claude-opus-4-5-20251101-v1:0"
```
To disable inference profile discovery:
```json5
{
  models: {
    bedrockDiscovery: {
      includeInferenceProfiles: false
    }
  }
}
```
## Setup (manual)

- Ensure AWS credentials are available on the gateway host:

```bash
export AWS_ACCESS_KEY_ID="AKIA..."
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION="us-east-1"

# Optional:
export AWS_SESSION_TOKEN="..."
export AWS_PROFILE="your-profile"

# Optional (Bedrock API key/bearer token):
export AWS_BEARER_TOKEN_BEDROCK="..."
```

- Add a Bedrock provider and model to your config (no `apiKey` required):
```json5
{
  models: {
    providers: {
      "amazon-bedrock": {
        baseUrl: "https://bedrock-runtime.us-east-1.amazonaws.com",
        api: "bedrock-converse-stream",
        auth: "aws-sdk",
        models: [
          {
            id: "anthropic.claude-opus-4-5-20251101-v1:0",
            name: "Claude Opus 4.5 (Bedrock)",
            reasoning: true,
            input: ["text", "image"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 200000,
            maxTokens: 8192
          }
        ]
      }
    }
  },
  agents: {
    defaults: {
      model: { primary: "amazon-bedrock/anthropic.claude-opus-4-5-20251101-v1:0" }
    }
  }
}
```
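After saving the config, a quick way to confirm the provider is wired up is to list models and check that the Bedrock entry appears:

```bash
# The configured Bedrock model (or discovered models) should show up here
openclaw models list
```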
## EC2 Instance Roles

When running OpenClaw on an EC2 instance with an IAM role attached, the AWS SDK automatically uses the instance metadata service (IMDS) for authentication. However, OpenClaw's credential detection currently only checks for environment variables, not IMDS credentials.

Workaround: set `AWS_PROFILE=default` to signal that AWS credentials are available. The actual authentication still uses the instance role via IMDS.
```bash
# Add to ~/.bashrc or your shell profile
export AWS_PROFILE=default
export AWS_REGION=us-east-1
```
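To confirm the instance role is actually being picked up (independently of the workaround variables), you can ask IMDSv2 which role is attached, or let the SDK chain resolve an identity; this assumes `curl` and the AWS CLI are available on the instance:

```bash
# IMDSv2: fetch a session token, then list the attached role name
TOKEN=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" \
  -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
curl -s -H "X-aws-ec2-metadata-token: $TOKEN" \
  http://169.254.169.254/latest/meta-data/iam/security-credentials/

# Or resolve via the default chain; the ARN should reference the instance role
aws sts get-caller-identity
```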
Required IAM permissions for the EC2 instance role:

- `bedrock:InvokeModel`
- `bedrock:InvokeModelWithResponseStream`
- `bedrock:ListFoundationModels` (for automatic discovery)
- `bedrock:ListInferenceProfiles` (for inference profile discovery)

Or attach the managed policy `AmazonBedrockFullAccess`.
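If you prefer least privilege over the managed policy, a minimal sketch of an inline policy covering the permissions above (resource scoping left as `*` for brevity; tighten to specific model ARNs where your setup allows):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:ListFoundationModels",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
```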
Quick setup:

```bash
# 1. Create IAM role and instance profile
aws iam create-role --role-name EC2-Bedrock-Access \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "ec2.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'

aws iam attach-role-policy --role-name EC2-Bedrock-Access \
  --policy-arn arn:aws:iam::aws:policy/AmazonBedrockFullAccess

aws iam create-instance-profile --instance-profile-name EC2-Bedrock-Access
aws iam add-role-to-instance-profile \
  --instance-profile-name EC2-Bedrock-Access \
  --role-name EC2-Bedrock-Access

# 2. Attach to your EC2 instance
aws ec2 associate-iam-instance-profile \
  --instance-id i-xxxxx \
  --iam-instance-profile Name=EC2-Bedrock-Access

# 3. On the EC2 instance, enable discovery
openclaw config set models.bedrockDiscovery.enabled true
openclaw config set models.bedrockDiscovery.region us-east-1

# 4. Set the workaround env vars
echo 'export AWS_PROFILE=default' >> ~/.bashrc
echo 'export AWS_REGION=us-east-1' >> ~/.bashrc
source ~/.bashrc

# 5. Verify models are discovered
openclaw models list
```
## Notes

- Bedrock requires model access to be enabled in your AWS account/region.
- Automatic discovery needs the `bedrock:ListFoundationModels` and `bedrock:ListInferenceProfiles` permissions.
- If you use profiles, set `AWS_PROFILE` on the gateway host.
- OpenClaw surfaces the credential source in this order: `AWS_BEARER_TOKEN_BEDROCK`, then `AWS_ACCESS_KEY_ID` + `AWS_SECRET_ACCESS_KEY`, then `AWS_PROFILE`, then the default AWS SDK chain.
- Reasoning support depends on the model; check the Bedrock model card for current capabilities.
- Inference profiles use the same authentication and permissions as base models.
- If inference profile discovery fails, base model discovery continues normally.
- If you prefer a managed key flow, you can also place an OpenAI-compatible proxy in front of Bedrock and configure it as an OpenAI provider instead (see the sketch below).
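A minimal sketch of that proxy setup, assuming the proxy exposes an OpenAI-compatible endpoint at `https://bedrock-proxy.internal/v1` (hypothetical URL) and assuming `openai-completions` is the right `api` value for your pi-ai build (check your build's provider docs; the model ID below is whatever name your proxy exposes):

```json5
{
  models: {
    providers: {
      "bedrock-proxy": {
        baseUrl: "https://bedrock-proxy.internal/v1", // hypothetical proxy URL
        api: "openai-completions",                    // assumption: verify the exact api value for your pi-ai build
        apiKey: "PROXY_API_KEY",                      // the proxy's managed key, not an AWS credential
        models: [
          { id: "claude-opus-4-5", name: "Claude Opus 4.5 (via proxy)" } // proxy-side model name, illustrative
        ]
      }
    }
  }
}
```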