AWS Bedrock
Head to the AWS docs to sign up for AWS and set up your credentials.
You’ll also need to turn on model access for your account, which you can do by following these instructions.
Authentication
- You will need to set the following environment variables:
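For example, a `.env` excerpt might look like the following (variable names follow LibreChat's standard `BEDROCK_AWS_*` convention; the values are placeholders for your own IAM credentials):

```shell
# .env — placeholder values; substitute your own IAM credentials and region
BEDROCK_AWS_ACCESS_KEY_ID=your-access-key-id
BEDROCK_AWS_SECRET_ACCESS_KEY=your-secret-access-key
BEDROCK_AWS_DEFAULT_REGION=us-east-1
```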
Note: You can also omit the access keys in order to use the default AWS credential provider chain, but you must still set the default region:
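For example, leaving the access keys unset and providing only the region (placeholder value shown):

```shell
# .env — region is still required when relying on the default credential chain
BEDROCK_AWS_DEFAULT_REGION=us-east-1
```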
Doing so prompts the credential provider to find credentials from the following sources (listed in order of precedence):
- Environment variables exposed via process.env
- SSO credentials from token cache
- Web identity token credentials
- Shared credentials and config ini files
- The EC2/ECS Instance Metadata Service
The default credential provider will invoke one provider at a time and only continue to the next if no credentials have been located.
For example, if the process finds values defined via the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, the files at ~/.aws/credentials and ~/.aws/config will not be read, nor will any messages be sent to the Instance Metadata Service.
Configuring models
- You can optionally specify which models you want to make available with `BEDROCK_AWS_MODELS`:
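For example, as a comma-separated list (the model IDs below are illustrative; any supported Bedrock model IDs work):

```shell
# .env — restrict the endpoint to a chosen subset of Bedrock models
BEDROCK_AWS_MODELS=anthropic.claude-3-5-sonnet-20240620-v1:0,meta.llama3-1-8b-instruct-v1:0
```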
Note: If omitted, all known, supported model IDs will be included automatically.
- See all Bedrock model IDs here:
Additional Configuration
You can further configure the Bedrock endpoint in your librechat.yaml file:
- `streamRate`: (Optional) Set the rate of processing each new token in milliseconds.
  - This can help stabilize processing of concurrent requests and provide smoother frontend stream rendering.
- `titleModel`: (Optional) Specify the model to use for generating conversation titles.
  - Recommended: `anthropic.claude-3-haiku-20240307-v1:0`
  - Omit or set as `current_model` to use the same model as the chat.
- `availableRegions`: (Optional) Specify the AWS regions you want to make available.
  - If provided, users will see a dropdown to select the region. If not selected, the default region is used.
- `guardrailConfig`: (Optional) Configure AWS Bedrock Guardrails for content filtering.
  - `guardrailIdentifier`: The guardrail ID or ARN from your AWS Bedrock Console.
  - `guardrailVersion`: The guardrail version number (e.g., `"1"`) or `"DRAFT"`.
  - `trace`: (Optional) Enable trace logging: `"enabled"`, `"disabled"`, or `"enabled_full"`.
  - See the AWS Bedrock Guardrails documentation for creating and managing guardrails.
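Taken together, the options above might be sketched in `librechat.yaml` as follows (all values are illustrative placeholders; substitute your own guardrail ID, regions, and preferred models):

```yaml
endpoints:
  bedrock:
    # milliseconds between token writes; smooths frontend stream rendering
    streamRate: 35
    # model used to generate conversation titles
    titleModel: 'anthropic.claude-3-haiku-20240307-v1:0'
    # regions users can pick from a dropdown
    availableRegions:
      - 'us-east-1'
      - 'us-west-2'
    guardrailConfig:
      guardrailIdentifier: 'your-guardrail-id'
      guardrailVersion: '1'
      trace: 'enabled'
```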
Inference Profiles
AWS Bedrock inference profiles let you create custom routing configurations for foundation models, enabling cross-region load balancing, cost allocation, and compliance controls. You can map model IDs to custom inference profile ARNs in your librechat.yaml:
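As a rough sketch of the idea (the `inferenceProfiles` key below is hypothetical; consult the linked Bedrock Inference Profiles guide for the exact schema, and note the ARN is a placeholder):

```yaml
endpoints:
  bedrock:
    # hypothetical mapping of a public model ID to a custom
    # application inference profile ARN; exact field names may differ
    inferenceProfiles:
      'anthropic.claude-3-5-sonnet-20240620-v1:0': 'arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/your-profile-id'
```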
For the full guide on creating profiles, configuring LibreChat, setting up logging, and troubleshooting, see Bedrock Inference Profiles.
For the YAML field reference, see AWS Bedrock Object Structure.
Notes
- The following models are not supported due to lack of streaming capability:
  - ai21.j2-mid-v1
- The following models are not supported due to lack of conversation history support:
  - ai21.j2-ultra-v1
  - cohere.command-text-v14
  - cohere.command-light-text-v14
- The AWS Bedrock endpoint supports all Shared Endpoint Settings via the `librechat.yaml` configuration file, including `streamRate`, `titleModel`, `titleMethod`, `titlePrompt`, `titlePromptTemplate`, and `titleEndpoint`.