Custom Parameters
Picking a Default Parameter Set
By default, when you specify a custom endpoint in your librechat.yaml config file, it uses the default parameters of the OpenAI API. You can override this behavior by setting the customParams.defaultParamsEndpoint field within the definition of your custom endpoint. For example, to use Google parameters for your custom endpoint:
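A minimal sketch of such an endpoint definition is shown below; the apiKey, baseURL, and model name are placeholders you would replace with your own values:

```yaml
endpoints:
  custom:
    - name: "Google Gemini"
      # Placeholder credentials and URL; substitute your own
      apiKey: "${GEMINI_API_KEY}"
      baseURL: "https://example.com/v1"
      models:
        default: ["gemini-2.0-flash"]
      customParams:
        # Display and send Google-style parameters instead of the OpenAI defaults
        defaultParamsEndpoint: "google"
```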
Your "Google Gemini" endpoint will now display parameters for Google API when you create a new agent or preset.
Overriding Parameter Definitions
On top of that, you can fine-tune the individual parameters provided for your custom endpoint. For example, the temperature parameter for the google endpoint is a slider with a range from 0.0 to 1.0 and a default of 1.0. You can update the librechat.yaml file to override these values:
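A sketch of such an override follows; the exact nesting of the range fields under paramDefinitions is an assumption based on the behavior described here and may differ slightly in your LibreChat version:

```yaml
      customParams:
        defaultParamsEndpoint: "google"
        paramDefinitions:
          # Constrain the temperature slider and change its default
          - key: "temperature"
            range:
              min: 0.0
              max: 0.7
              step: 0.1
            default: 0.5
```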
As a result, the Temperature slider will be limited to the range of 0.0 to 0.7 with a step of 0.1 and a default of 0.5. The remaining parameters keep their default values.
Setting Default Parameter Values
You can specify default values for parameters that will be automatically applied when making API requests. This is useful for setting baseline parameter values for your custom endpoint without requiring users to manually configure them each time.
The default field in paramDefinitions allows you to set default values that are applied when parameters are undefined. These defaults follow a priority order to ensure proper override behavior:
Priority Order (lowest to highest):
- Default values from paramDefinitions: applied first when a parameter is undefined
- addParams: can override default values
- User-configured modelOptions: highest priority, overrides everything
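For instance, a baseline temperature could be configured like this (a sketch; the paramDefinitions nesting is assumed from the schema described above):

```yaml
      customParams:
        defaultParamsEndpoint: "openAI"
        paramDefinitions:
          # Applied only when the user has not set temperature themselves
          - key: "temperature"
            default: 0.7
```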
In this example:
- If a user doesn't specify temperature, it defaults to 0.7
- If a user explicitly sets temperature to 0.5, their value (0.5) takes precedence
- The addParams field (if configured) can override these defaults
- User selections in the UI always have the highest priority
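To illustrate the middle priority level, addParams can supply a value that beats a paramDefinitions default while still losing to a user's UI selection (a sketch, continuing the hypothetical endpoint above):

```yaml
      addParams:
        # Overrides a paramDefinitions default of 0.7, but a user-selected
        # temperature in the UI would still take precedence over this value
        temperature: 0.9
```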
Anthropic
When using defaultParamsEndpoint: 'anthropic', the system provides special handling that goes beyond just displaying and using Anthropic parameter sets:
Anthropic API Compatibility
Setting defaultParamsEndpoint: 'anthropic' enables full Anthropic API compatibility for parameters, headers, and payload formatting:
- Parameters are sent to your custom endpoint exactly as the Anthropic API expects
- This is essential for proxy services like LiteLLM that pass non-OpenAI-spec parameters directly to the underlying provider
- Anthropic-specific parameters like thinking are properly formatted
- The messages payload is formatted according to Anthropic's requirements (thinking blocks and prompt caching)
- Appropriate beta headers are automatically added based on the model, just as when using Anthropic directly
This is mainly necessary to properly format the thinking parameter, which is not OpenAI-compatible:
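For reference, the Anthropic API expects thinking as a structured object rather than a flat OpenAI-style sampling parameter. Roughly, the request body carries a field of this shape (shown as YAML for readability; the budget value is illustrative):

```yaml
# Anthropic request-body shape for extended thinking; an OpenAI-spec
# endpoint has no equivalent field, so it must be passed through as-is
thinking:
  type: "enabled"
  budget_tokens: 2048
```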
Additionally, the system automatically adds model-specific Anthropic beta headers such as:
- anthropic-beta: prompt-caching-2024-07-31 for prompt caching support
- anthropic-beta: context-1m-2025-08-07 for extended context models
- Model-specific feature flags based on the Claude model being used
This ensures full compatibility when routing through proxy services or directly to Anthropic-compatible endpoints.
Implementation Status
Currently, this automatic parameter and header handling is fully implemented only for Anthropic endpoints. Similar behavior for other defaultParamsEndpoint values (e.g., google, bedrock) is planned for future updates.