---
summary: Use Amazon Bedrock Mantle (OpenAI-compatible) models with OpenClaw
read_when:
title: Amazon Bedrock Mantle
---
# Amazon Bedrock Mantle
OpenClaw includes a bundled Amazon Bedrock Mantle provider that connects to
the Mantle OpenAI-compatible endpoint. Mantle hosts open-source and
third-party models (GPT-OSS, Qwen, Kimi, GLM, and similar) through a standard
`/v1/chat/completions` surface backed by Bedrock infrastructure.
## What OpenClaw supports
- Provider: `amazon-bedrock-mantle`
- API: `openai-completions` (OpenAI-compatible)
- Auth: explicit `AWS_BEARER_TOKEN_BEDROCK` or IAM credential-chain bearer-token generation
- Region: `AWS_REGION` or `AWS_DEFAULT_REGION` (default: `us-east-1`)
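The auth precedence above can be sketched as a small shell check. This is illustrative only: OpenClaw performs this resolution internally, and the messages below are placeholders, not OpenClaw output.

```shell
# Illustrative sketch of the auth precedence: an explicit token wins,
# otherwise OpenClaw falls back to the AWS IAM credential chain.
if [ -n "${AWS_BEARER_TOKEN_BEDROCK:-}" ]; then
  echo "auth: explicit AWS_BEARER_TOKEN_BEDROCK"
else
  echo "auth: bearer token derived from the IAM credential chain"
fi
```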
## Automatic model discovery
When `AWS_BEARER_TOKEN_BEDROCK` is set, OpenClaw uses it directly. Otherwise,
OpenClaw attempts to generate a Mantle bearer token from the AWS default
credential chain, including shared credentials/config profiles, SSO, web
identity, and instance or task roles. It then discovers available Mantle
models by querying the region's `/v1/models` endpoint. Discovery results are
cached for 1 hour, and IAM-derived bearer tokens are refreshed hourly.
Supported regions: `us-east-1`, `us-east-2`, `us-west-2`, `ap-northeast-1`,
`ap-south-1`, `ap-southeast-3`, `eu-central-1`, `eu-west-1`, `eu-west-2`,
`eu-south-1`, `eu-north-1`, `sa-east-1`.
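As a quick sanity check, you can derive the regional endpoint yourself and query it with `curl`. This sketch assumes every region follows the same `bedrock-mantle.<region>.api.aws` URL pattern shown for `us-east-1` in the manual configuration example; substitute any supported region.

```shell
# Derive the Mantle base URL for a region (assumed pattern; us-east-1 is
# the one confirmed in the manual configuration example in this doc).
REGION="us-east-1"
BASE_URL="https://bedrock-mantle.${REGION}.api.aws/v1"
echo "${BASE_URL}/models"

# With a valid bearer token you could list models directly (not run here):
#   curl -s -H "Authorization: Bearer ${AWS_BEARER_TOKEN_BEDROCK}" "${BASE_URL}/models"
```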
## Onboarding
- Choose one auth path on the gateway host:

  Explicit bearer token:

  ```bash
  export AWS_BEARER_TOKEN_BEDROCK="..."
  # Optional (defaults to us-east-1):
  export AWS_REGION="us-west-2"
  ```

  IAM credentials:

  ```bash
  # Any AWS SDK-compatible auth source works here, for example:
  export AWS_PROFILE="default"
  export AWS_REGION="us-west-2"
  ```

- Verify models are discovered:

  ```bash
  openclaw models list
  ```
Discovered models appear under the `amazon-bedrock-mantle` provider. No
additional config is required unless you want to override defaults.
## Manual configuration
If you prefer explicit config instead of auto-discovery:
```json5
{
  models: {
    providers: {
      "amazon-bedrock-mantle": {
        baseUrl: "https://bedrock-mantle.us-east-1.api.aws/v1",
        api: "openai-completions",
        auth: "api-key",
        apiKey: "env:AWS_BEARER_TOKEN_BEDROCK",
        models: [
          {
            id: "gpt-oss-120b",
            name: "GPT-OSS 120B",
            reasoning: true,
            input: ["text"],
            cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
            contextWindow: 32000,
            maxTokens: 4096,
          },
        ],
      },
    },
  },
}
```
## Notes
- OpenClaw can mint the Mantle bearer token for you from AWS SDK-compatible
  IAM credentials when `AWS_BEARER_TOKEN_BEDROCK` is not set.
- The bearer token is the same `AWS_BEARER_TOKEN_BEDROCK` used by the
  standard Amazon Bedrock provider.
- Reasoning support is inferred from model IDs containing patterns like
  `thinking`, `reasoner`, or `gpt-oss-120b`.
- If the Mantle endpoint is unavailable or returns no models, the provider
  is silently skipped.
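The reasoning-inference note above can be illustrated with a small shell function. The exact matching logic inside OpenClaw may differ; `infer_reasoning` is a hypothetical helper using only the patterns named in the notes.

```shell
# Hypothetical sketch of the reasoning heuristic: substring-match model IDs
# against the patterns mentioned above (thinking, reasoner, gpt-oss-120b).
infer_reasoning() {
  case "$1" in
    *thinking*|*reasoner*|*gpt-oss-120b*) echo "reasoning" ;;
    *) echo "no-reasoning" ;;
  esac
}

infer_reasoning "gpt-oss-120b"   # prints "reasoning"
infer_reasoning "qwen3-coder"    # prints "no-reasoning"
```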