TrueFoundry AI Gateway

TrueFoundry AI Gateway is a proxy layer that sits between your applications and your LLM providers and MCP servers. It is an enterprise-grade platform that gives users access to 1000+ LLMs through a unified interface while handling observability and governance.

Configuration Details

To use TrueFoundry's AI Gateway, follow the quick start guide here.

Getting Base URL and Model Names

  • Base URL and model names: copy your TrueFoundry AI Gateway endpoint URL and the model names from the gateway's unified code snippet (make sure the same model names have been added in the gateway).
  • PAT: generate a TrueFoundry Personal Access Token (PAT) to use as the API key. Both values can be supplied to LibreChat as environment variables, as shown below.
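
For reference, the environment variable names used in the librechat.yaml example below can be set in LibreChat's .env file. The values here are placeholders; substitute the base URL from your gateway's unified code snippet and your own PAT.

.env
  # Placeholder values: replace with your gateway base URL and your PAT
  TRUEFOUNDRY_GATEWAY_URL=<your-truefoundry-gateway-base-url>
  TRUEFOUNDRY_API_KEY=<your-truefoundry-pat>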

Using TrueFoundry AI Gateway with any LLM model

librechat.yaml
  endpoints:
    custom:
      - name: "TrueFoundry"
        apiKey: "${TRUEFOUNDRY_API_KEY}"
        baseURL: "${TRUEFOUNDRY_GATEWAY_URL}"
        models:
          default: ["openai-main/gpt-4o-mini", "openai-main/gpt-4o"]
          fetch: true
        titleConvo: true
        titleModel: "current_model"
        summarize: false
        summaryModel: "current_model"
        forcePrompt: false
        modelDisplayLabel: "TrueFoundry:OpenAI"
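
LibreChat's custom endpoints expect an OpenAI-compatible API, so you can sanity-check the same base URL and PAT outside LibreChat before starting the app. The following is a minimal sketch, assuming the gateway exposes an OpenAI-compatible chat completions endpoint, that the openai Python SDK is installed, and that "openai-main/gpt-4o-mini" from the example above has been added to your gateway; adjust the model name to match your setup.

  import os

  from openai import OpenAI

  # Reuse the same environment variables referenced in librechat.yaml above
  client = OpenAI(
      api_key=os.environ["TRUEFOUNDRY_API_KEY"],      # TrueFoundry PAT
      base_url=os.environ["TRUEFOUNDRY_GATEWAY_URL"],  # Gateway base URL
  )

  # The model name must match one configured in the gateway (see "default" above)
  response = client.chat.completions.create(
      model="openai-main/gpt-4o-mini",
      messages=[{"role": "user", "content": "Say hello from the TrueFoundry AI Gateway"}],
  )
  print(response.choices[0].message.content)

If this request succeeds, the same credentials will work for the LibreChat endpoint configured above.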

For more details, see the TrueFoundry Docs.