Questions about function calls with VLLM and Ollama #988
 
Unanswered

zzllkk2003 asked this question in Q&A
 
- Does Spring AI currently support vLLM? If not, what is the expected timeline for adding support?
- Does Spring AI support function calling with Ollama? Is this feature expected to land in the M2 release?
 
Replies: 1 comment
Regarding question 2: work is in progress to add function-calling support to the Ollama integration. You can follow the status here: #720
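For context, a minimal sketch of the function-calling pattern Spring AI uses in its other chat model integrations (for example OpenAI) is shown below; the Ollama integration tracked in #720 is expected to follow a similar approach. The bean name `currentWeatherFunction` and the record types are illustrative assumptions, not part of the Ollama integration.

```java
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;

@Configuration
public class WeatherFunctionConfig {

    // Hypothetical payloads: the model fills in a WeatherRequest as JSON
    // and gets the WeatherResponse back as the tool result.
    public record WeatherRequest(String city) {}

    public record WeatherResponse(double temperatureCelsius) {}

    // A plain java.util.function.Function registered as a Spring bean; the
    // @Description text is what the model sees when deciding whether to call it.
    @Bean
    @Description("Get the current temperature for a city")
    public Function<WeatherRequest, WeatherResponse> currentWeatherFunction() {
        return request -> new WeatherResponse(22.0); // stubbed lookup for the sketch
    }
}
```

The function is then enabled by name in the per-request chat options when building the prompt; the Ollama-specific options API for doing that is what #720 is tracking.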
 