Providers

Amazon Bedrock Mantle

OpenClaw includes a bundled Amazon Bedrock Mantle provider that connects to the Mantle OpenAI-compatible endpoint. Mantle hosts open-source and third-party models (GPT-OSS, Qwen, Kimi, GLM, and similar) through a standard /v1/chat/completions surface backed by Bedrock infrastructure.

Property         Value
Provider ID      amazon-bedrock-mantle
API              openai-completions (OpenAI-compatible) or anthropic-messages (Anthropic Messages route)
Auth             Explicit AWS_BEARER_TOKEN_BEDROCK or IAM credential-chain bearer-token generation
Default region   us-east-1 (override with AWS_REGION or AWS_DEFAULT_REGION)

Getting started

Choose your preferred auth method and follow the setup steps.

Explicit bearer token

Best for: environments where you already have a Mantle bearer token.

  • Set the bearer token on the gateway host

    export AWS_BEARER_TOKEN_BEDROCK="..."
    

    Optionally set a region (defaults to us-east-1):

    export AWS_REGION="us-west-2"
    
  • Verify models are discovered

    openclaw models list
    

    Discovered models appear under the amazon-bedrock-mantle provider. No additional config is required unless you want to override defaults.
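To sanity-check the token outside OpenClaw, you can resolve the endpoint the same way the provider does: region from AWS_REGION or AWS_DEFAULT_REGION, falling back to us-east-1. The URL shape below follows the baseUrl shown under manual configuration; treat it as a sketch rather than a guaranteed contract:

```shell
# Resolve the Mantle endpoint from the environment, defaulting to us-east-1.
REGION="${AWS_REGION:-${AWS_DEFAULT_REGION:-us-east-1}}"
MANTLE_URL="https://bedrock-mantle.${REGION}.api.aws/v1"
echo "$MANTLE_URL"
```

With the token exported, `curl -H "Authorization: Bearer $AWS_BEARER_TOKEN_BEDROCK" "$MANTLE_URL/models"` should return the same model list that `openclaw models list` discovers.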

IAM credentials

Best for: using AWS SDK-compatible credentials (shared config, SSO, web identity, instance or task roles).

  • Configure AWS credentials on the gateway host

    Any AWS SDK-compatible auth source works:

    export AWS_PROFILE="default"
    export AWS_REGION="us-west-2"
    
  • Verify models are discovered

    openclaw models list
    

    OpenClaw generates a Mantle bearer token from the credential chain automatically.
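Before starting OpenClaw, it can be worth confirming that the default credential chain actually resolves. A minimal check, assuming the AWS CLI is installed (OpenClaw itself does not require it):

```shell
# Ask STS who the credential chain resolves to; any SDK-compatible source
# (profile, SSO, instance or task role) that works here should work for
# OpenClaw's bearer-token generation too.
check_chain() {
  if aws sts get-caller-identity >/dev/null 2>&1; then
    echo "credential chain resolved"
  else
    echo "credential chain did not resolve"
  fi
}
check_chain
```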

Automatic model discovery

When AWS_BEARER_TOKEN_BEDROCK is set, OpenClaw uses it directly. Otherwise, OpenClaw attempts to generate a Mantle bearer token from the AWS default credential chain. It then discovers available Mantle models by querying the region's /v1/models endpoint.

Behavior           Detail
Discovery cache    Results cached for 1 hour
IAM token refresh  Hourly
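The one-hour discovery cache amounts to a freshness check on the stored model list. A sketch in shell, with an illustrative cache path (not OpenClaw's actual cache location):

```shell
# Treat the cached model list as fresh if it was modified within the last
# 60 minutes; otherwise it is stale and /v1/models should be re-queried.
cache_state() {
  if [ -f "$1" ] && [ -z "$(find "$1" -mmin +60)" ]; then
    echo fresh
  else
    echo stale
  fi
}
cache_state /tmp/mantle-models.json
```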

Supported regions

us-east-1, us-east-2, us-west-2, ap-northeast-1, ap-south-1, ap-southeast-3, eu-central-1, eu-west-1, eu-west-2, eu-south-1, eu-north-1, sa-east-1.
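A quick way to check a region against this list before deploying, with the region set copied verbatim from above:

```shell
# Regions where the Mantle endpoint is available (copied from the docs).
MANTLE_REGIONS="us-east-1 us-east-2 us-west-2 ap-northeast-1 ap-south-1 ap-southeast-3 eu-central-1 eu-west-1 eu-west-2 eu-south-1 eu-north-1 sa-east-1"
mantle_supported() {
  # Wrap both list and candidate in spaces so matches are whole-word only.
  case " $MANTLE_REGIONS " in
    *" $1 "*) echo yes ;;
    *) echo no ;;
  esac
}
mantle_supported "${AWS_REGION:-us-east-1}"
```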

Manual configuration

If you prefer explicit config instead of auto-discovery:

    {
      models: {
        providers: {
          "amazon-bedrock-mantle": {
            baseUrl: "https://bedrock-mantle.us-east-1.api.aws/v1",
            api: "openai-completions",
            auth: "api-key",
            apiKey: "env:AWS_BEARER_TOKEN_BEDROCK",
            models: [
              {
                id: "gpt-oss-120b",
                name: "GPT-OSS 120B",
                reasoning: true,
                input: ["text"],
                cost: { input: 0, output: 0, cacheRead: 0, cacheWrite: 0 },
                contextWindow: 32000,
                maxTokens: 4096,
              },
            ],
          },
        },
      },
    }
    

Advanced configuration

Reasoning support

Reasoning support is inferred from model IDs containing patterns like thinking, reasoner, or gpt-oss-120b. OpenClaw sets reasoning: true automatically for matching models during discovery.
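The inference rule amounts to a substring match on the model ID. A sketch of that matching, using only the patterns named above (the real matcher may cover more):

```shell
# Print true when a model ID matches one of the reasoning ID patterns.
infer_reasoning() {
  case "$1" in
    *thinking*|*reasoner*|*gpt-oss-120b*) echo true ;;
    *) echo false ;;
  esac
}
infer_reasoning "gpt-oss-120b"   # → true
```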

Endpoint unavailability

If the Mantle endpoint is unavailable or returns no models, the provider is silently skipped. OpenClaw does not error; other configured providers continue to work normally.

Claude Opus 4.7 via the Anthropic Messages route

Mantle also exposes an Anthropic Messages route that carries Claude models through the same bearer-authenticated streaming path. Claude Opus 4.7 (amazon-bedrock-mantle/claude-opus-4.7) is callable through this route with provider-owned streaming, so AWS bearer tokens are not treated like Anthropic API keys.

When you pin an Anthropic Messages model on the Mantle provider, OpenClaw uses the anthropic-messages API surface instead of openai-completions for that model. Auth still comes from AWS_BEARER_TOKEN_BEDROCK (or the minted IAM bearer token).

    {
      models: {
        providers: {
          "amazon-bedrock-mantle": {
            models: [
              {
                id: "claude-opus-4.7",
                name: "Claude Opus 4.7",
                api: "anthropic-messages",
                reasoning: true,
                input: ["text", "image"],
                contextWindow: 1000000,
                maxTokens: 32000,
              },
            ],
          },
        },
      },
    }
    
Relationship to Amazon Bedrock provider

Bedrock Mantle is a separate provider from the standard Amazon Bedrock provider. Mantle uses an OpenAI-compatible /v1 surface, while the standard Bedrock provider uses the native Bedrock API.

Both providers share the same AWS_BEARER_TOKEN_BEDROCK credential when present.