
I'm using a locally hosted model (llama3.2) with Ollama and trying to replicate functionality similar to bind_tools (to create and run tools with the LLM) for tool calling. This is my model service:

from src.config.settings import settings
from langchain.prompts import PromptTemplate
from langchain.memory import ConversationBufferWindowMemory
from langchain_ollama import OllamaLLM


class LLMService:
    """
    Service for managing LLM interactions and conversation flow.

    This service will handle:
    - Ollama model initialization and configuration
    - Prompt template management for the finance domain
    - Response generation with context
    - Conversation memory management
    - RAG prompt engineering for finance queries
    """

    def __init__(self):
        self.llm = OllamaLLM(
            model=settings.OLLAMA_MODEL,
            base_url=settings.OLLAMA_BASE_URL,
            stream=True,
        )
        self.memory = ConversationBufferWindowMemory(
            memory_key="chat_history", return_messages=True, k=10
        )
        self.prompt_template = PromptTemplate.from_template("""
        You are a helpful assistant for the {firm_name} accounting website. Use only the provided text-file knowledge base to answer user questions accurately.

        Answer requirements:
        1. Quote exact figures/dates when available
        2. Reference relevant forms/regulations where applicable
        3. Keep answers under 3 sentences unless technical details require more
        4. Add an anchor tag with the attribute target="_blank" if any link is provided
        5. Only use links that appear in the knowledge text files, never any others

        If the question is not related to accounting or the {firm_name} website, reply:
        "Please ask a question related to accounting or the {firm_name} website."
        Context: {context}
        Question: {question}
        Concise accounting answer:
        """)

    def get_llm(self):
        return self.llm

    def get_memory(self):
        return self.memory

    def get_prompt(self, firm_name: str = "GL"):
        return self.prompt_template.partial(firm_name=firm_name)


llm_service = LLMService()

Since bind_tools is not available on OllamaLLM with llama3.2, I created a custom router like this:

def router_node(state):
    input_text = state['messages'][-1]['content']
    if 'account' in input_text.lower():
        return 'chatbot_node'
    else:
        return 'other_node'

This approach works for simple keyword matching, but it doesn’t really understand the full context or intent behind a user’s input. I’m looking for a better, more context-aware way to route requests to different tools or nodes.

What I'm looking for:

  • A method to semantically analyze the input and decide which tool to invoke.
  • A way to achieve this using Ollama (or any local model) since bind_tools is not available.
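
One approach that works with any local model is to use the LLM itself as a zero-shot intent classifier over the candidate routes. A minimal sketch under that assumption (the prompt wording, `llm_router` helper, and route names are illustrative; `generate` stands in for any text-completion callable such as `OllamaLLM.invoke`):

```python
# Ask the model to label the message instead of keyword matching.
ROUTER_PROMPT = (
    "Classify the user's message into exactly one category.\n"
    "Categories: accounting (accounts, tax, bookkeeping, the firm's website), "
    "other (anything else).\n"
    "Reply with only the category name.\n"
    "Message: {message}\n"
    "Category:"
)

def llm_router(state, generate):
    """Route based on the LLM's one-word classification.

    `generate` is any callable that takes a prompt string and returns the
    model's text, e.g. `llm.invoke` for a local Ollama model.
    """
    message = state["messages"][-1]["content"]
    label = generate(ROUTER_PROMPT.format(message=message)).strip().lower()
    return "chatbot_node" if label.startswith("accounting") else "other_node"

# Stubbing the model here just to show the control flow:
fake_llm = lambda prompt: "accounting"
state = {"messages": [{"content": "How do I reconcile my ledger?"}]}
print(llm_router(state, fake_llm))  # -> chatbot_node
```

In a LangGraph graph this function can be plugged in as a conditional-edge router, since it only depends on the state dict and returns a node name.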

Has anyone faced this issue or found an effective way to do context-based tool routing with local LLMs?

asked Jun 25, 2025 at 8:10
  • I am working with the Langgraph framework. Commented Jun 25, 2025 at 8:21
  • maybe you should check source code of bind_tools and recreate it (or maybe even copy all code) Commented Jun 25, 2025 at 12:28
  • Thank you for your suggestion; I will try it. Commented Jun 26, 2025 at 13:30
  • You can modify the initial context provided to your controller to also include a list of the tools it has available, and instruct it to respond in a specific format when it wants to invoke any of those tools; then your router node can invoke the tools as needed. It is not an elegant solution, but it should work. Commented Jul 15, 2025 at 8:39
  • Thanks for your comment, @KaranShishoo. I’ve found a solution! It turns out that OllamaLLM doesn’t support the bind_tools method, but ChatOllama does. Both are from the same langchain_ollama library, so you can import it like this: from langchain_ollama import ChatOllama I'm currently integrating this into my application and will post a complete answer once it's finalized. Commented Jul 16, 2025 at 9:09
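
Following up on that last comment, a minimal sketch of what the ChatOllama tool-calling flow could look like (the get_account_balance tool and the account number are made-up examples, and running this requires a local Ollama server with a tool-capable llama3.2 model pulled):

```python
from langchain_ollama import ChatOllama
from langchain_core.tools import tool

@tool
def get_account_balance(account_id: str) -> str:
    """Look up the balance for an account in the firm's ledger."""
    # Hypothetical placeholder implementation.
    return f"Balance for account {account_id}: $0.00"

# ChatOllama (unlike OllamaLLM) exposes bind_tools.
llm = ChatOllama(model="llama3.2")
llm_with_tools = llm.bind_tools([get_account_balance])

response = llm_with_tools.invoke("What is the balance of account 1234?")
# When the model decides to call a tool, the request shows up in
# response.tool_calls as dicts with "name" and "args" keys.
for call in response.tool_calls:
    print(call["name"], call["args"])
```

Whether the model actually emits a tool call still depends on the model and prompt, so the tool_calls list may be empty for some inputs.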
