# Custom Parameters

## Picking a Default Parameter Set
By default, when you specify a custom endpoint in your `librechat.yaml` config file, it uses the default parameters of the OpenAI API. You can override this by setting the `customParams.defaultParamsEndpoint` field within the definition of your custom endpoint. For example, to use Google parameters for your custom endpoint:
```yaml
endpoints:
  custom:
    - name: 'Google Gemini'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'google'
```

Your "Google Gemini" endpoint will now display the Google API parameters when you create a new agent or preset.
## Overriding Parameter Definitions
On top of that, you can also fine-tune the parameters provided for your custom endpoint. For example, the `temperature` parameter for the `google` endpoint is a slider with a range from 0.0 to 1.0 and a default of 1.0. You can update the `librechat.yaml` file to override these values:
```yaml
endpoints:
  custom:
    - name: 'Google Gemini'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'google'
        paramDefinitions:
          - key: temperature
            range:
              min: 0
              max: 0.7
              step: 0.1
            default: 0.5
```

As a result, the Temperature slider will be limited to the range 0.0 to 0.7, with a step of 0.1 and a default of 0.5. The remaining parameters keep their default values.
## Setting Default Parameter Values
You can specify default values for parameters that will be automatically applied when making API requests. This is useful for setting baseline parameter values for your custom endpoint without requiring users to manually configure them each time.
The `default` field in `paramDefinitions` lets you set values that are applied when a parameter is undefined. These defaults follow a priority order to ensure proper override behavior:
**Priority Order (lowest to highest):**

1. Default values from `paramDefinitions`: applied first, when a parameter is undefined
2. `addParams`: can override default values
3. User-configured `modelOptions`: highest priority, overrides everything
```yaml
endpoints:
  custom:
    - name: 'My Custom LLM'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'openAI'
        paramDefinitions:
          - key: temperature
            default: 0.7
          - key: topP
            default: 0.9
          - key: maxTokens
            default: 2000
```

In this example:
- If a user doesn't specify `temperature`, it defaults to `0.7`
- If a user explicitly sets `temperature` to `0.5`, their value (`0.5`) takes precedence
- The `addParams` field (if configured) can override these defaults
- User selections in the UI always have the highest priority
## Anthropic
When using `defaultParamsEndpoint: 'anthropic'`, the system provides special handling that goes beyond simply displaying and using the Anthropic parameter set. This is mainly necessary to properly format the `thinking` parameter, which is not OpenAI-compatible:
```json
{
  "thinking": {
    "type": "enabled",
    "budget_tokens": 10000
  }
}
```

Additionally, the system automatically adds model-specific Anthropic beta headers, such as:
- `anthropic-beta: prompt-caching-2024-07-31` for prompt caching support
- `anthropic-beta: context-1m-2025-08-07` for extended context models
- Model-specific feature flags based on the Claude model being used
This ensures full compatibility when routing through proxy services or directly to Anthropic-compatible endpoints.
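For reference, a minimal sketch of a custom endpoint that opts into this Anthropic handling might look like the following (the endpoint name is illustrative; `apiKey` and `baseURL` depend on your provider or proxy):

```yaml
endpoints:
  custom:
    - name: 'Claude via Proxy' # illustrative name
      apiKey: ...
      baseURL: ...
      customParams:
        # enables Anthropic parameter display, thinking formatting,
        # and automatic beta headers described above
        defaultParamsEndpoint: 'anthropic'
```

With this in place, the `thinking` parameter and beta headers are handled for you; no manual header configuration is needed.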