[Screenshot: QQ截图20240823161258]
I want to use the glm-4 model, but it doesn't work. The following error message appears (shown in the screenshot below):
[Screenshot: QQ截图20240823160905]
Replies: 2 comments
-
Hi, I think in your case you have set the API URL to what looks like an OpenAI-compatible endpoint.
My suggestion would be:
Set LLM_NAME=openai
Then, inside application/llm/openai.py, initialize the client as:
self.client = OpenAI(api_key=api_key, base_url="http://open.bigmodel.cn/api/paas/v4")
Also, in your environment variables, set MODEL_NAME to glm-4.
API_URL should point to the DocsGPT backend; the configuration of the LLM endpoint is separate.
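For reference, here is a minimal sketch of what that change might look like, assuming LLM_NAME=openai and MODEL_NAME=glm-4 are set as described. The class name, constructor signature, and chat method below are illustrative assumptions, not the exact code in application/llm/openai.py:

import os
from openai import OpenAI

class OpenAILLM:
    """Hypothetical wrapper illustrating the suggestion above; the real
    application/llm/openai.py class may differ."""

    def __init__(self, api_key=None):
        # Reuse the OpenAI-compatible client, but point it at the GLM endpoint
        # instead of api.openai.com (URL taken from the suggestion above).
        self.client = OpenAI(
            api_key=api_key or os.environ.get("OPENAI_API_KEY"),
            base_url="http://open.bigmodel.cn/api/paas/v4",
        )

    def chat(self, messages, model="glm-4"):
        # MODEL_NAME=glm-4 from the environment would be passed in here as `model`.
        response = self.client.chat.completions.create(model=model, messages=messages)
        return response.choices[0].message.content

# Example usage (hypothetical):
# llm = OpenAILLM(api_key="your-bigmodel-api-key")
# print(llm.chat([{"role": "user", "content": "Hello"}]))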
Hope this helps!
-
Your message has been received!!