Default Parameters
Note
- The purpose of this part of the documentation is to help understand what `addParams` and `dropParams` do. You CANNOT globally configure the parameters and values that LibreChat sends by default; they can only be configured within a single endpoint.
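As a point of reference, here is a minimal sketch of how these two keys might appear on a custom endpoint in `librechat.yaml`. The endpoint name, base URL, API key variable, and parameter values below are placeholders, not recommendations:

```yaml
endpoints:
  custom:
    - name: "ExampleProvider"             # placeholder endpoint name
      apiKey: "${EXAMPLE_API_KEY}"        # typically read from your .env file
      baseURL: "https://api.example.com/v1"
      models:
        default: ["example-model"]
      # addParams merges extra key/value pairs into the request payload
      addParams:
        safe_prompt: true                 # example of a provider-specific field
      # dropParams removes default parameters from the request payload
      dropParams: ["user", "frequency_penalty", "presence_penalty"]
```

In other words, keys under `addParams` are added to the request payload, while parameter names listed in `dropParams` are stripped from it.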
Custom endpoints share logic with the OpenAI endpoint, and thus have default parameters tailored to the OpenAI API.
Default Parameters

```json
{
  "model": "your-selected-model",
  "temperature": 1,
  "top_p": 1,
  "presence_penalty": 0,
  "frequency_penalty": 0,
  "user": "LibreChat_User_ID",
  "stream": true,
  "messages": [
    {
      "role": "user",
      "content": "hi how are you"
    }
  ]
}
```
Breakdown
- `model`: The selected model from the list of models.
- `temperature`: Defaults to `1` if not provided via preset.
- `top_p`: Defaults to `1` if not provided via preset.
- `presence_penalty`: Defaults to `0` if not provided via preset.
- `frequency_penalty`: Defaults to `0` if not provided via preset.
- `user`: A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
- `stream`: If set, partial message deltas will be sent, like in ChatGPT. Otherwise, generation will only be available when completed.
- `messages`: OpenAI format for messages; the `name` field is added to messages with `system` and `assistant` roles when a custom name is specified via preset.
Note: The `max_tokens` field is not sent, in order to use the maximum number of tokens available, which is the default OpenAI API behavior. Some alternate APIs require this field, or may default it to a very low value, causing your responses to appear cut off; in this case, you should add it via the `addParams` field, as shown in the Custom Endpoint Object Structure.
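For instance, a minimal sketch of adding `max_tokens` through `addParams` might look like the following. The endpoint details and the token limit are illustrative only; use whatever values your provider supports:

```yaml
endpoints:
  custom:
    - name: "ExampleProvider"
      apiKey: "${EXAMPLE_API_KEY}"
      baseURL: "https://api.example.com/v1"
      models:
        default: ["example-model"]
      addParams:
        max_tokens: 4096   # illustrative value; check your provider's limits
```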