# LiteLLM

Example configuration for LiteLLM

Notes: Refer to [Using LibreChat with LiteLLM Proxy](https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/litellm) for configuration details.

```yaml
- name: "LiteLLM"
  apiKey: "sk-from-config-file"
  # if using the LiteLLM example in docker-compose.override.yml.example, use "http://litellm:8000/v1"
  baseURL: "http://localhost:8000/v1"
  models:
    default: ["gpt-3.5-turbo"]
    fetch: true
  titleConvo: true
  titleModel: "gpt-3.5-turbo"
  summarize: false
  summaryModel: "gpt-3.5-turbo"
  modelDisplayLabel: "LiteLLM"
```
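The `baseURL` above assumes a LiteLLM proxy is already running and serving an OpenAI-compatible API on port 8000. As a minimal sketch (the model names and environment-variable reference here are illustrative, not required values), the proxy side could be configured with a `config.yaml` like this:

```yaml
# LiteLLM proxy config (sketch): maps the model name LibreChat requests
# to an upstream provider. Adjust model names and keys for your setup.
model_list:
  - model_name: gpt-3.5-turbo          # name LibreChat will request
    litellm_params:
      model: openai/gpt-3.5-turbo      # upstream provider/model
      api_key: os.environ/OPENAI_API_KEY
```

The proxy would then be started pointing at this file (for example, `litellm --config config.yaml --port 8000`) so that LibreChat's `baseURL` resolves to it.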