Welcome to the guide for configuring the librechat.yaml file in LibreChat.

This file enables the integration of custom AI endpoints, letting you connect to any AI provider compatible with OpenAI API standards.

Key Features

  • Endpoint Integration: Seamlessly integrate with a variety of AI providers compliant with OpenAI API standards, including Mistral AI, reverse proxies, and more.
  • Advanced Customization: Configure file handling, rate limiting, user registration, and interface elements to align with your preferences and requirements.
  • Model Specifications: Define detailed model configurations, presets, and behaviors to deliver a tailored AI experience.
  • Assistants Integration: Leverage the power of OpenAI’s Assistants, with options to customize capabilities, polling intervals, and timeouts.
  • Azure OpenAI Support: Integrate with Azure OpenAI Service, enabling access to multiple deployments, region models, and serverless inference endpoints.
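As a starting point, here is a minimal sketch of a `librechat.yaml` that registers one custom endpoint; the exact field names and the Mistral AI base URL shown are illustrative assumptions, so check them against the full reference before use:

```yaml
# Hypothetical minimal librechat.yaml sketch — verify field names
# and values against the full configuration reference.
version: 1.0.0
cache: true

endpoints:
  custom:
    # Example custom endpoint (Mistral AI, assumed baseURL)
    - name: "Mistral"
      apiKey: "${MISTRAL_API_KEY}"   # read from your environment
      baseURL: "https://api.mistral.ai/v1"
      models:
        default: ["mistral-small-latest"]
        fetch: true                   # fetch the model list from the provider
      titleConvo: true
```

The sections that follow document each of these options, and many more, in detail.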

Future updates will streamline configuration further by migrating some settings from your .env file to librechat.yaml.

Stay tuned for ongoing enhancements to customize your LibreChat instance!

Note: To verify your YAML config, you can use the YAML Validator or other online YAML validation tools.