structquery is a lightweight Python package that converts unstructured text queries into structured, search-optimized outputs. Built on the `llmatch-messages` utility and a default LLM7 model, it extracts key elements from user input (e.g., location, difficulty, length) and returns them in a format ready for downstream search engines or information-retrieval pipelines.
- One-line transformation from free-form text to a list of structured fields.
- Built-in LLM7 support (`ChatLLM7` from `langchain_llm7`) with automatic API-key handling.
- Pluggable LLMs – swap in any LangChain-compatible chat model (OpenAI, Anthropic, Google, etc.).
- Regex-driven output validation via `llmatch` to guarantee format correctness.
- Zero-config defaults – works out of the box for common use cases.
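To illustrate the regex-driven validation idea, here is a minimal sketch in plain Python. The tag names and pattern below are hypothetical, not the actual `llmatch` implementation:

```python
import re

# Hypothetical format: each structured field is wrapped in <field>...</field> tags.
FIELD_PATTERN = re.compile(r"<field>(.*?)</field>", re.DOTALL)

def extract_fields(raw_output: str) -> list[str]:
    """Return all fields found in the raw LLM output, or raise if none match."""
    fields = FIELD_PATTERN.findall(raw_output)
    if not fields:
        raise ValueError("LLM output did not match the expected format")
    return [f.strip() for f in fields]

# A well-formed response yields a clean list of strings.
print(extract_fields("<field>location: Colorado</field><field>activity: hiking</field>"))
# → ['location: Colorado', 'activity: hiking']
```

Rejecting non-matching output up front is what lets the package guarantee a predictable return shape to callers.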
```bash
pip install structquery
```
```python
from structquery import structquery

# Simple call – uses the default ChatLLM7 model
response = structquery(user_input="best hiking trails in Colorado")
print(response)  # → list of extracted structured strings
```
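Because the return value is a plain list of strings, it can be fed straight into a downstream search pipeline. A minimal sketch, where the field values are illustrative rather than actual package output:

```python
def build_search_query(fields: list[str]) -> str:
    """Join extracted fields into a single AND-style query string."""
    return " AND ".join(f'"{f}"' for f in fields)

# Illustrative fields, similar in shape to what structquery returns
fields = ["hiking trails", "Colorado", "difficulty: moderate"]
print(build_search_query(fields))
# → "hiking trails" AND "Colorado" AND "difficulty: moderate"
```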
| Name | Type | Description |
|---|---|---|
| `user_input` | `str` | The raw user query or description you want to structure. |
| `llm` | `Optional[BaseChatModel]` | Your own LangChain chat model. If omitted, the package creates a `ChatLLM7` instance automatically. |
| `api_key` | `Optional[str]` | LLM7 API key. If omitted, the package reads `LLM7_API_KEY` from the environment; otherwise it falls back to the placeholder `"None"` (which triggers an error from the LLM service). |
You can provide any LangChain-compatible chat model that inherits from `BaseChatModel`. Below are examples for the most popular providers.
```python
from langchain_openai import ChatOpenAI
from structquery import structquery

llm = ChatOpenAI(model="gpt-4o-mini")
response = structquery(user_input="latest smartphones under 500ドル", llm=llm)
```
```python
from langchain_anthropic import ChatAnthropic
from structquery import structquery

llm = ChatAnthropic(model_name="claude-3-haiku-20240307")
response = structquery(user_input="affordable vegan meal plan", llm=llm)
```
```python
from langchain_google_genai import ChatGoogleGenerativeAI
from structquery import structquery

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
response = structquery(user_input="top-rated bike touring routes in Europe", llm=llm)
```
- The free tier of LLM7 offers generous rate limits that cover most experimentation and production needs.
- To obtain a free API key, register at https://token.llm7.io/.
- You can provide the key in three ways:
  - Environment variable: `export LLM7_API_KEY=your_key`
  - Direct argument: `structquery(user_input, api_key="your_key")`
  - Pass a pre-configured `ChatLLM7` instance via the `llm` argument.
- Bug reports / feature requests: https://github.com/chigwell/structquery/issues
- Contributions are welcome! Fork the repository, make your changes, and submit a pull request.
Distributed under the MIT License. See the LICENSE file for details.
Eugene Evstafev – hi@euegne.plus
GitHub: https://github.com/chigwell