# Custom Parameters (/docs/configuration/librechat_yaml/object_structure/custom_params)

### Picking a Default Parameter Set

By default, a custom endpoint defined in the `librechat.yaml` config file uses the default parameters of the OpenAI API. You can override this by setting the `customParams.defaultParamsEndpoint` field within the definition of your custom endpoint. For example, to use Google parameters for your custom endpoint:

```yaml filename="excerpt of librechat.yaml"
endpoints:
  custom:
    - name: 'Google Gemini'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'google'
```

Your "Google Gemini" endpoint will now display parameters for Google API when you create a new agent or preset.

### Overriding Parameter Definitions

On top of that, you can also fine-tune the parameter definitions for your custom endpoint. For example, the `temperature` parameter for the Google endpoint is a slider ranging from 0.0 to 1.0 with a default of 1.0; you can update the `librechat.yaml` file to override these values:

```yaml filename="excerpt of librechat.yaml"
endpoints:
  custom:
    - name: 'Google Gemini'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'google'
        paramDefinitions:
          - key: temperature
            range:
              min: 0
              max: 0.7
              step: 0.1
            default: 0.5
```

As a result, the `Temperature` slider will be limited to the range `0.0` to `0.7` with a step of `0.1` and a default of `0.5`. All other parameters keep their default definitions.

### Setting Default Parameter Values

You can specify default values for parameters that will be automatically applied when making API requests. This is useful for setting baseline parameter values for your custom endpoint without requiring users to manually configure them each time.

The `default` field in `paramDefinitions` allows you to set default values that are applied when parameters are undefined. These defaults follow a priority order to ensure proper override behavior:

**Priority Order (lowest to highest):**
1. **Default values from `paramDefinitions`** - Applied when the parameter is undefined
2. **`addParams`** - Can override default values
3. **User-configured `modelOptions`** - Highest priority, overrides everything

```yaml filename="excerpt of librechat.yaml"
endpoints:
  custom:
    - name: 'My Custom LLM'
      apiKey: ...
      baseURL: ...
      customParams:
        defaultParamsEndpoint: 'openAI'
        paramDefinitions:
          - key: temperature
            default: 0.7
          - key: topP
            default: 0.9
          - key: maxTokens
            default: 2000
```

In this example:
- If a user doesn't specify `temperature`, it defaults to `0.7`
- If a user explicitly sets `temperature` to `0.5`, their value (`0.5`) takes precedence
- The `addParams` field (if configured) can override these defaults, as sketched below
- User selections in the UI always have the highest priority
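
To illustrate the priority order, here is a sketch that combines a `paramDefinitions` default with `addParams` (the endpoint name and values are illustrative, not a recommended configuration):

```yaml filename="excerpt of librechat.yaml"
endpoints:
  custom:
    - name: 'My Custom LLM'
      apiKey: ...
      baseURL: ...
      addParams:
        temperature: 0.3 # overrides the paramDefinitions default below
      customParams:
        defaultParamsEndpoint: 'openAI'
        paramDefinitions:
          - key: temperature
            default: 0.7 # used only if neither addParams nor the user sets a value
```

With this configuration, requests use `temperature: 0.3` unless the user explicitly sets a different value in the UI, which always takes precedence.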

### Anthropic

When using `defaultParamsEndpoint: 'anthropic'`, the system provides special handling that goes beyond simply displaying and using the Anthropic parameter set:

<Callout type="info">
**Anthropic API Compatibility**

Setting `defaultParamsEndpoint: 'anthropic'` enables full Anthropic API compatibility for parameters, headers, and payload formatting:
- Parameters are sent to your custom endpoint exactly as the Anthropic API expects
- This is essential for proxy services like LiteLLM that pass non-OpenAI-spec parameters directly to the underlying provider
- Anthropic-specific parameters like `thinking` are properly formatted
- The `messages` payload is formatted according to Anthropic's requirements (thinking blocks and prompt caching)
- Appropriate beta headers are automatically added based on the model, just as when using the Anthropic endpoint directly
</Callout>

This is mainly necessary to properly format the `thinking` parameter, which is not OpenAI-compatible:

```json
{
  "thinking": {
    "type": "enabled",
    "budget_tokens": 10000
  }
}
```

Additionally, the system automatically adds model-specific Anthropic beta headers such as:
- `anthropic-beta: prompt-caching-2024-07-31` for prompt caching support
- `anthropic-beta: context-1m-2025-08-07` for extended context models
- Other feature flags based on the Claude model in use

This ensures full compatibility when routing through proxy services or directly to Anthropic-compatible endpoints.
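
For example, here is a minimal sketch of a Claude endpoint routed through a LiteLLM proxy; the `baseURL` and model name are assumptions for illustration and depend on your proxy setup:

```yaml filename="excerpt of librechat.yaml"
endpoints:
  custom:
    - name: 'Claude via LiteLLM'
      apiKey: ...
      baseURL: 'http://localhost:4000/v1' # hypothetical LiteLLM proxy address
      models:
        default: ['claude-sonnet-4-20250514'] # illustrative model name
      customParams:
        defaultParamsEndpoint: 'anthropic'
```

With `defaultParamsEndpoint: 'anthropic'` set, requests to this endpoint carry Anthropic-formatted parameters and beta headers, which the proxy can pass through to the underlying provider.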

<Callout type="note">
**Implementation Status**

Currently, this automatic parameter and header handling is fully implemented for Anthropic endpoints. Similar behavior for other `defaultParamsEndpoint` values (e.g., `google`, `bedrock`) is planned for future updates.
</Callout>