Ollama
Ollama API key: required, but the value is ignored by Ollama (see Ollama OpenAI Compatibility)
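Since the key is ignored, any value passes through. A quick way to confirm the endpoint works before configuring LibreChat (a minimal sketch, assuming Ollama is running locally on its default port 11434 and `llama2` has been pulled):

```bash
# Any Authorization value is accepted; "ollama" here is just a placeholder
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ollama" \
  -d '{
    "model": "llama2",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```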
Notes:
- Known endpoint: an icon is provided automatically.
- Download models with the `ollama run` command (see the Ollama Library); an example follows this list.
- It’s recommended to set `titleModel` to "current_model" to avoid loading more than one model per conversation. Doing so dynamically uses the current conversation’s model for title generation.
- For convenience, the example below includes the top 5 most popular models from the Ollama Library, last updated on March 1, 2024.
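For example, to make one of the models from the list below available locally (assuming the `ollama` CLI is installed on the host):

```bash
# Download a model and open an interactive chat session with it
ollama run mistral

# Or download it without starting a chat
ollama pull mistral
```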
librechat.yaml
custom:
  - name: "Ollama"
    apiKey: "ollama"
    # use 'host.docker.internal' instead of localhost if running LibreChat in a docker container
    baseURL: "http://localhost:11434/v1/chat/completions"
    models:
      default: [
        "llama2",
        "mistral",
        "codellama",
        "dolphin-mixtral",
        "mistral-openorca"
      ]
      # fetching the list of models is supported, but the `name` field must start
      # with `ollama` (case-insensitive), as it does in this example
      fetch: true
    titleConvo: true
    titleModel: "current_model"
    summarize: false
    summaryModel: "current_model"
    forcePrompt: false
    modelDisplayLabel: "Ollama"
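With `fetch: true`, LibreChat retrieves the model list from the endpoint rather than relying solely on `default`. To see which models Ollama can report (a sketch, assuming the default port; the OpenAI-compatible `/v1/models` route requires a reasonably recent Ollama release):

```bash
# Models installed locally
ollama list

# The OpenAI-compatible model list a fetch would retrieve
curl http://localhost:11434/v1/models
```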
[Screenshot: selecting the Ollama endpoint with the llama3 model in LibreChat]