
Commit 252d9e2

Update README.md with Ollama notes

1 parent dfb6378

File tree

1 file changed (+1 −1 lines changed)


README.md

Lines changed: 1 addition & 1 deletion

@@ -125,7 +125,7 @@ Since the local app uses OpenAI models, you should first deploy it for the optim
 ```

 3. To use OpenAI.com OpenAI, set `OPENAI_CHAT_HOST` and `OPENAI_EMBED_HOST` to "openai". Then fill in the value for `OPENAICOM_KEY`.
-4. To use Ollama, set `OPENAI_CHAT_HOST` to "ollama". Then update the values for `OLLAMA_ENDPOINT` and `OLLAMA_CHAT_MODEL` to match your local setup and model. Note that you won't be able to use function calling with an Ollama model, and you'll need to either turn off vector search or use either "azure" or "openai" for `OPENAI_EMBED_HOST`.
+4. To use Ollama, set `OPENAI_CHAT_HOST` to "ollama". Then update the values for `OLLAMA_ENDPOINT` and `OLLAMA_CHAT_MODEL` to match your local setup and model. Note that most Ollama models are not compatible with the "Advanced flow", due to the need for function calling support, so you'll need to disable that in _Developer Settings_ in the UI. In addition, the database rows are embedded using the default OpenAI embedding model, so you can't search them using an Ollama embedding model. You can either set `OPENAI_EMBED_HOST` to "azure" or "openai", or turn off vector search in _Developer Settings_.

 ### Running the frontend and backend
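
For readers following the Ollama path described in step 4 of the updated README text, the variables referenced in the diff are typically supplied as environment variables (for example via a `.env` file). The sketch below is illustrative only: the variable names come from the diff, but the endpoint and model values are assumptions for a default local Ollama install, not values taken from this repository.

```shell
# Illustrative .env sketch for the Ollama setup in step 4.
# Values are placeholders for a default local Ollama install; adjust to your setup.
OPENAI_CHAT_HOST=ollama
OLLAMA_ENDPOINT=http://localhost:11434   # Ollama's default local port
OLLAMA_CHAT_MODEL=llama3.1               # any chat model you have pulled locally

# Database rows are embedded with the default OpenAI embedding model, so either
# keep an OpenAI-compatible embedding host for queries...
OPENAI_EMBED_HOST=openai
OPENAICOM_KEY=<your-openai.com-key>
# ...or omit the two lines above and turn off vector search in Developer Settings.
```

As the diff notes, most Ollama models lack function calling support, so the "Advanced flow" also needs to be disabled in _Developer Settings_ in the UI.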

0 commit comments
