AWS Bedrock

Head to the AWS docs to sign up for AWS and set up your credentials.

You’ll also need to turn on model access for your account, which you can do by following these instructions.

Authentication

  • You will need to set the following environment variables:
.env
BEDROCK_AWS_DEFAULT_REGION=us-east-1
BEDROCK_AWS_ACCESS_KEY_ID=your_access_key_id
BEDROCK_AWS_SECRET_ACCESS_KEY=your_secret_access_key

Note: You can also omit the access keys to use the default AWS credentials chain, but you must still set the default region:

.env
BEDROCK_AWS_DEFAULT_REGION=us-east-1

Doing so prompts the credential provider to find credentials from the following sources (listed in order of precedence):

  • Environment variables exposed via process.env
  • SSO credentials from token cache
  • Web identity token credentials
  • Shared credentials and config ini files
  • The EC2/ECS Instance Metadata Service

The default credential provider invokes one provider at a time and only continues to the next if no credentials have been located.

For example, if the process finds values defined via the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, the files at ~/.aws/credentials and ~/.aws/config will not be read, nor will any messages be sent to the Instance Metadata Service.
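As an illustration, a minimal shared credentials file (the fourth source in the list above) that the default chain can pick up might look like the following; the profile name and placeholder values are illustrative:

~/.aws/credentials
[default]
aws_access_key_id = your_access_key_id
aws_secret_access_key = your_secret_access_key

With credentials stored there, only BEDROCK_AWS_DEFAULT_REGION needs to be set in your .env.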

Configuring models

  • You can optionally specify which models you want to make available with BEDROCK_AWS_MODELS:
.env
BEDROCK_AWS_MODELS=anthropic.claude-3-5-sonnet-20240620-v1:0,meta.llama3-1-8b-instruct-v1:0

Note: If omitted, all known, supported model IDs will be included automatically.

Additional Configuration

You can further configure the Bedrock endpoint in your librechat.yaml file:

endpoints:
  bedrock:
    availableRegions:
      - "us-east-1"
      - "us-west-2"
    streamRate: 35
    titleModel: 'anthropic.claude-3-haiku-20240307-v1:0'
  • streamRate: (Optional) Set the rate of processing each new token in milliseconds.

    • This can help stabilize processing of concurrent requests and provide smoother frontend stream rendering.
  • titleModel: (Optional) Specify the model to use for generating conversation titles.

    • Recommended: anthropic.claude-3-haiku-20240307-v1:0.
    • Omit or set as current_model to use the same model as the chat.
  • availableRegions: (Optional) Specify the AWS regions you want to make available.

    • If provided, users will see a dropdown to select the region. If no region is selected, the default region is used.

Notes

  • The following models are not supported due to lack of streaming capability:

    • ai21.j2-mid-v1
  • The following models are not supported due to lack of conversation history support:

    • ai21.j2-ultra-v1
    • cohere.command-text-v14
    • cohere.command-light-text-v14