# Custom Parameters

## Picking a Default Parameter Set
By default, when you specify a custom endpoint in the `librechat.yaml` config file, it uses the default parameters of the OpenAI API. However, you can override these defaults by setting the `customParams.defaultParamsEndpoint` field within the definition of your custom endpoint. For example, to use Google parameters for your custom endpoint:
```yaml
endpoints:
  custom:
    - name: 'Google Gemini'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'google'
```
Your “Google Gemini” endpoint will now display the Google API's parameters when you create a new agent or preset.
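The same field works for any custom endpoint, so several endpoints can each borrow a different provider's parameter set. A sketch, where the endpoint names, API key variables, and base URLs are illustrative placeholders:

```yaml
endpoints:
  custom:
    # Hypothetical proxy that speaks the Anthropic parameter set
    - name: 'My Anthropic Proxy'
      apiKey: '${MY_PROXY_KEY}'
      baseURL: 'https://proxy.example.com/v1'
      customParams:
        defaultParamsEndpoint: 'anthropic'
    # Hypothetical proxy that speaks the Google parameter set
    - name: 'My Google Proxy'
      apiKey: '${MY_PROXY_KEY}'
      baseURL: 'https://proxy.example.com/v1'
      customParams:
        defaultParamsEndpoint: 'google'
```

Each endpoint then shows its borrowed provider's parameters in the agent and preset dialogs.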
## Overriding Parameter Definitions
On top of that, you can also fine-tune the parameters provided for your custom endpoint. For example, the `temperature` parameter for the `google` endpoint is a slider ranging from 0.0 to 1.0 with a default of 1.0. You can update the `librechat.yaml` file to override these values:
```yaml
endpoints:
  custom:
    - name: 'Google Gemini'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'google'
        paramDefinitions:
          - key: temperature
            range:
              min: 0
              max: 0.7
              step: 0.1
            default: 0.5
```
As a result, the `Temperature` slider will be limited to the range of `0.0` to `0.7`, with a step of `0.1` and a default of `0.5`. The rest of the parameters will be set to their default values.
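Multiple parameters can be overridden in a single `paramDefinitions` list. A sketch, assuming `topP` is a valid parameter key in the Google parameter set (verify the key names against the parameters your chosen `defaultParamsEndpoint` actually exposes):

```yaml
customParams:
  defaultParamsEndpoint: 'google'
  paramDefinitions:
    - key: temperature
      range:
        min: 0
        max: 0.7
        step: 0.1
      default: 0.5
    - key: topP   # assumed key name; check your endpoint's parameter list
      range:
        min: 0
        max: 0.95
        step: 0.05
      default: 0.9
```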
## Anthropic
When using `defaultParamsEndpoint: 'anthropic'`, the system provides special handling that goes beyond simply displaying and using the Anthropic parameter set. This is mainly necessary to properly format the `thinking` parameter, which is not OpenAI-compatible:
```json
{
  "thinking": {
    "type": "enabled",
    "budget_tokens": 10000
  }
}
```
Additionally, the system automatically adds model-specific Anthropic beta headers, such as:

- `anthropic-beta: prompt-caching-2024-07-31` for prompt caching support
- `anthropic-beta: context-1m-2025-08-07` for extended-context models
- Model-specific feature flags based on the Claude model being used
This ensures full compatibility when routing through proxy services or directly to Anthropic-compatible endpoints.
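Putting the pieces together, the body forwarded to an Anthropic-compatible backend would look roughly like the following. This is an illustrative sketch of the Anthropic Messages API request shape, not LibreChat's actual output; the model name, token counts, and message content are placeholders:

```json
{
  "model": "claude-sonnet-4",
  "max_tokens": 11024,
  "thinking": {
    "type": "enabled",
    "budget_tokens": 10000
  },
  "messages": [
    { "role": "user", "content": "Hello" }
  ]
}
```

The beta features ride alongside this body as an `anthropic-beta` request header rather than as a field in the JSON payload.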