Portkey AI

Portkey Docs

Portkey API key: app.portkey.ai
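
The two librechat.yaml examples below read the gateway URL and API key from environment variables. A minimal sketch of the matching .env entries, assuming Portkey's hosted gateway (adjust the URL if you self-host the gateway):

.env
    # Portkey's hosted gateway; adjust if you self-host the gateway
    PORTKEY_GATEWAY_URL=https://api.portkey.ai/v1
    # your Portkey API key from app.portkey.ai
    PORTKEY_API_KEY=your-portkey-api-key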

Notes:

  • LibreChat requires the API key field to be present, but Portkey does not use it for this integration; you can pass any dummy string.
  • Portkey integrates with LibreChat, offering observability, 50+ guardrails, caching, and conditional routing with fallbacks and retries for reliable, production-grade deployments.
  • Portkey supports 250+ models; you can find the list of providers in the Portkey Docs.
  • You can use Portkey in two ways: through Virtual Keys or through Configs.
  • You can set model-specific parameters such as top_p and max_tokens by using a Portkey Config; a minimal sketch follows this list, and the Portkey Docs describe the full schema.
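
For the Configs route, you create a Config in the Portkey dashboard and reference its ID from librechat.yaml (see the second example below). A minimal sketch of such a Config; the virtual key name and parameter values here are illustrative placeholders, and the Portkey Docs describe the full schema:

    {
      "virtual_key": "openai-virtual-key",
      "override_params": {
        "model": "gpt-4o-mini",
        "max_tokens": 512,
        "top_p": 0.9
      },
      "retry": { "attempts": 3 },
      "cache": { "mode": "simple" }
    }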

Using Portkey AI with Virtual Keys

librechat.yaml
      - name: "Portkey"
        apiKey: "dummy"  
        baseURL: ${PORTKEY_GATEWAY_URL}
        headers:
            x-portkey-api-key: "${PORTKEY_API_KEY}"
            x-portkey-virtual-key: "PORTKEY_OPENAI_VIRTUAL_KEY"
        models:
            default: ["gpt-4o-mini"]
            fetch: true
        titleConvo: true
        titleModel: "current_model"
        summarize: false
        summaryModel: "current_model"
        forcePrompt: false
        modelDisplayLabel: "Portkey:OpenAI"
        iconURL: https://images.crunchbase.com/image/upload/c_pad,f_auto,q_auto:eco,dpr_1/rjqy7ghvjoiu4cd1xjbf
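
PORTKEY_OPENAI_VIRTUAL_KEY above is a placeholder for the virtual key created in your Portkey dashboard; like the other values, it can also be read from your .env file with the ${...} syntax.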

Using Portkey AI with Configs

librechat.yaml
    - name: "Portkey"
      apikey: "dummy"
      baseURL: ${PORTKEY_GATEWAY_URL}
      headers:
        x-portkey-api-key: "${PORTKEY_API_KEY}"
        x-portkey-config: "pc-libre-xxx"
      models:
        default: ["llama-3.2"]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Portkey:Llama"
      iconURL: https://images.crunchbase.com/image/upload/c_pad,f_auto,q_auto:eco,dpr_1/rjqy7ghvjoiu4cd1xjbf
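
Here pc-libre-xxx is a placeholder for the ID of the Config created in your Portkey dashboard (for example, the sketch after the Notes above); requests are routed through that Config, so caching, fallbacks, retries, and parameter overrides are controlled on the Portkey side.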
