Hi, I am trying to connect this frontend with the oobabooga API. When I use oobabooga directly I get good answers, but when I use it through this frontend connected to that API I get weird answers.
Parameters such as temperature should be set correctly in the model definition, as described in the README. I wonder whether we can also set the mode there (i.e. instruct, in my case for CodeLlama) and the instruction_template. At the moment I suspect this is the cause of the issue. Maybe you have another idea?
I also set the "preprompt".
All in all, this is the simplified definition:

```env
MODELS=[
  {
    "name": "codellama-7b-instruct.Q5_K_M.gguf",
    "displayName": "MYAPP",
    "id": "text-generation-webui",
    "preprompt": "Here is my prompt",
    "parameters": {
      "temperature": 1.0,
      "top_p": 0.95,
      "repetition_penalty": 1.0,
      "top_k": 50,
      "truncate": 10000,
      "max_new_tokens": 10000,
      "stop": []
    },
    "endpoints": [{ "type": "openai", "baseURL": "http://hostname:5000/v1" }]
  }
]
```
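For reference, a minimal sketch of the same definition with instruct-mode hints added under `parameters`. Note the assumptions: `mode` and `instruction_template` are oobabooga-specific extensions rather than standard OpenAI fields, the template name `"Llama-v2"` is only an example, and stock chat-ui does not forward unrecognized parameters, so these would only take effect if the endpoint code passes them through.

```typescript
// Hedged sketch: the MODELS definition above, extended with oobabooga
// instruct-mode extras. "mode" and "instruction_template" are NOT part of
// the OpenAI spec; chat-ui must be patched to forward them (assumption).
const MODELS = JSON.parse(`[
  {
    "name": "codellama-7b-instruct.Q5_K_M.gguf",
    "displayName": "MYAPP",
    "id": "text-generation-webui",
    "preprompt": "Here is my prompt",
    "parameters": {
      "temperature": 1.0,
      "top_p": 0.95,
      "repetition_penalty": 1.0,
      "top_k": 50,
      "truncate": 10000,
      "max_new_tokens": 10000,
      "stop": [],
      "mode": "instruct",
      "instruction_template": "Llama-v2"
    },
    "endpoints": [{ "type": "openai", "baseURL": "http://hostname:5000/v1" }]
  }
]`);

console.log(MODELS[0].parameters.mode); // "instruct"
```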
Replies: 1 comment
-
Actually, I just found in this file the parameters that are sent to the OpenAI-compatible endpoint:
https://github.com/huggingface/chat-ui/blob/main/src/lib/server/endpoints/openai/endpointOai.ts
In my installation I added additional parameters such as top_k, mode, and instruction_template. I don't know whether it would interest others as well to extend this part of the code to cover more parameters.
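The kind of change described here can be sketched as follows. This is not chat-ui's actual code: the helper name `buildBody` and the key list are illustrative, and which keys the upstream endpoint accepts (e.g. oobabooga's `mode` and `instruction_template`) is an assumption about that backend.

```typescript
// Hedged sketch: merge extra model parameters into the request body sent to
// an OpenAI-compatible endpoint, alongside the standard fields. Names here
// (buildBody, extraKeys) are illustrative, not chat-ui's API.
type Params = Record<string, unknown>;

function buildBody(modelParams: Params, messages: unknown[]): Params {
  // Backend-specific extras to forward when present (assumed to be
  // accepted by oobabooga's OpenAI-compatible server).
  const extraKeys = ["top_k", "mode", "instruction_template"];

  // Standard OpenAI-style fields.
  const body: Params = {
    messages,
    temperature: modelParams.temperature,
    top_p: modelParams.top_p,
    max_tokens: modelParams.max_new_tokens,
    stream: true,
  };

  // Copy over any extras the model definition provides.
  for (const key of extraKeys) {
    if (modelParams[key] !== undefined) body[key] = modelParams[key];
  }
  return body;
}
```

With this shape, any parameter listed in `extraKeys` and present in the model definition ends up in the request body, while unknown keys are still dropped by default.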