Questions about function calling with vLLM and Ollama #988

Unanswered
zzllkk2003 asked this question in Q&A

  1. Does Spring AI currently support vLLM? If not, what is the expected timeline for supporting it?
  2. Does Spring AI support function calling with Ollama? Is this feature expected to land in the M2 release?
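On question 1: vLLM exposes an OpenAI-compatible HTTP API, so even without first-class support, a common workaround is to point Spring AI's OpenAI client at the vLLM server. A minimal sketch, assuming the standard `spring.ai.openai.*` properties and a local vLLM server on port 8000; the model name and API key below are placeholders, and whether every feature (function calling in particular) works over this bridge depends on the model and vLLM version:

```yaml
spring:
  ai:
    openai:
      # vLLM's OpenAI-compatible server, started e.g. with:
      #   python -m vllm.entrypoints.openai.api_server --model <model>
      base-url: http://localhost:8000
      api-key: "unused-placeholder"   # local vLLM typically does not check it
      chat:
        options:
          model: "<the model vLLM is serving>"  # placeholder
```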

Replies: 1 comment


About question 2: work is in progress to add function-calling capability to the Ollama integration. You can follow the status here: #720
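For context while that work lands, the mechanism behind function calling is the same across providers: the model replies with a function name plus JSON-encoded arguments, the client looks up and executes the matching handler, and the string result is sent back to the model in a follow-up message. A protocol-level sketch of the client-side dispatch step, independent of any framework; the function name, handler, and payloads are illustrative, not a real Spring AI or Ollama API:

```java
import java.util.Map;
import java.util.function.Function;

public class FunctionCallSketch {

    // Registry of client-side functions the model is allowed to call.
    // Names and handlers here are illustrative placeholders.
    static final Map<String, Function<String, String>> FUNCTIONS = Map.of(
        "currentWeather", jsonArgs -> "{\"temp_c\": 21, \"args\": " + jsonArgs + "}"
    );

    // Dispatch one tool call as it would arrive from the model:
    // look up the named handler and run it on the raw JSON arguments.
    static String dispatch(String name, String jsonArgs) {
        Function<String, String> fn = FUNCTIONS.get(name);
        if (fn == null) {
            throw new IllegalArgumentException("unknown function: " + name);
        }
        return fn.apply(jsonArgs);
    }

    public static void main(String[] args) {
        // Simulated model output: call currentWeather with these arguments.
        String result = dispatch("currentWeather", "{\"city\":\"Paris\"}");
        System.out.println(result);
    }
}
```

In a real integration the result string would be appended to the conversation as a tool/function message so the model can produce its final answer.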

Category: Q&A
