Custom Endpoints

LibreChat supports OpenAI API compatible services using the librechat.yaml configuration file.
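In practice, "OpenAI API compatible" means the service accepts the same request shape as OpenAI's `/v1/chat/completions` route. The sketch below illustrates that shape; the base URL, key, and model name are placeholders, and the `build_chat_request` helper is purely illustrative, not part of LibreChat:

```python
import json
from urllib.request import Request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str) -> Request:
    """Build a chat-completion request in the OpenAI-compatible shape."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Example: a request aimed at a local Ollama instance (placeholder values).
req = build_chat_request("http://localhost:11434/v1/", "ollama", "llama2", "Hello!")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Any service that answers requests of this shape at some base URL can be wired into LibreChat as a custom endpoint.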

This guide assumes you have already set up LibreChat using Docker, as shown in the Local Setup Guide.

Step 1. Create or Edit a Docker Override File

  • Create a file named docker-compose.override.yml at the project root (if it doesn’t already exist).
  • Add the following content to the file:
    services:
      api:
        volumes:
          - type: bind
            source: ./librechat.yaml
            target: /app/librechat.yaml

Learn more about the Docker Compose Override File here.

Step 2. Configure librechat.yaml

  • Create a file named librechat.yaml at the project root (if it doesn’t already exist).

  • Add your custom endpoints: you can view compatible endpoints in the AI Endpoints section.

    • The list is not exhaustive; in general, any OpenAI API-compatible service should work.
    • There are many options for Custom Endpoints. View them all here: Custom Endpoint Object Structure.
  • As an example, here is a configuration for both OpenRouter and Ollama:

    version: 1.1.4
    cache: true
    endpoints:
      custom:
        - name: "OpenRouter"
          apiKey: "${OPENROUTER_KEY}"
          baseURL: "https://openrouter.ai/api/v1"
          models:
            default: ["gpt-3.5-turbo"]
            fetch: true
          titleConvo: true
          titleModel: "current_model"
          summarize: false
          summaryModel: "current_model"
          forcePrompt: false
          modelDisplayLabel: "OpenRouter"
        - name: "Ollama"
          apiKey: "ollama"
          baseURL: "http://host.docker.internal:11434/v1/"
          models:
            default: ["llama2"] # replace with models you have pulled locally
            fetch: false # fetching the list of models is not supported
          titleConvo: true
          titleModel: "current_model"
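YAML indentation errors are an easy way to break this file silently. A quick sanity check like the one below can catch missing keys before you restart the containers; the `validate_custom_endpoints` helper is illustrative only, not part of LibreChat, and operates on an already-parsed config dict:

```python
def validate_custom_endpoints(config: dict) -> list[str]:
    """Return a list of problems found in the custom-endpoint entries."""
    problems = []
    entries = config.get("endpoints", {}).get("custom", [])
    if not entries:
        problems.append("no endpoints.custom entries found")
    for entry in entries:
        for key in ("name", "apiKey", "baseURL", "models"):
            if key not in entry:
                problems.append(f"{entry.get('name', '?')}: missing '{key}'")
    return problems

# Example: an entry missing its baseURL is flagged.
config = {
    "endpoints": {
        "custom": [
            {"name": "Ollama", "apiKey": "ollama", "models": {"default": ["llama2"]}},
        ]
    }
}
print(validate_custom_endpoints(config))  # ["Ollama: missing 'baseURL'"]
```

In a real workflow you would parse librechat.yaml with a YAML library first and pass the result to a check like this.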

Step 3. Configure .env File

  • Edit your existing .env file at the project root.
    • If it doesn’t exist yet, copy .env.example and rename the copy to .env.
  • According to the config above, the environment variable OPENROUTER_KEY is expected and should be set:

    OPENROUTER_KEY=your-openrouter-api-key

  • By way of example, this guide assumes you have set up Ollama independently and that it is reachable at http://host.docker.internal:11434
    • “host.docker.internal” is a special DNS name that resolves to the host’s internal IP address from inside a container.
    • You may need to change this to the actual IP address of your Ollama instance.
  • In a future guide, we will go into setting up Ollama along with LibreChat.
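A common failure mode is a missing or misspelled variable in .env. A small stdlib-only check like the sketch below (the `parse_env` helper is just an illustration; real .env files can have quoting rules this doesn’t handle) can confirm the key is present before starting the app:

```python
from pathlib import Path

def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=value lines, ignoring blanks and comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# Example with an inline .env body; in practice, read Path(".env").read_text().
sample = "# LibreChat secrets\nOPENROUTER_KEY=sk-or-example\n"
env = parse_env(sample)
print("OPENROUTER_KEY" in env)  # True
```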

Step 4. Run the App

  • Now that your files are configured, you can run the app:
docker compose up

Or, if the app was already running, you can restart it with:

docker compose restart

Note: Make sure your Docker Desktop or Docker Engine is running before executing the command.


That’s it! You have now configured Custom Endpoints for your LibreChat instance.

Additional Links

Explore more about LibreChat and how to configure it to your needs.