Not sure if this has been discussed before, but I'm trying to understand how feasible this is. It would provide support "for free" for new models whenever all of their operations are already implemented in ggml/llama.cpp.

From my understanding, everything ultimately boils down to a compute graph with different operations and input/output tensors. If we were somehow able to take a model's compute graph and convert it automatically, it would be a very useful feature. Obviously there are other complexities with tokenizers, chat templates, etc., but getting the logits to match would be the logical first step.
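To make the idea concrete, here is a minimal sketch of what such an automatic conversion pass might look like. This is not real llama.cpp or ggml code; the op names, the toy graph representation, and the `GGML_SUPPORTED_OPS` table are all hypothetical and only illustrate the core check: walk the source model's compute graph in topological order, map each operation to a ggml-supported equivalent, and fail loudly on anything unsupported.

```python
from dataclasses import dataclass, field

# Hypothetical set of ops assumed to exist in ggml; a real converter
# would derive this from the ggml op enum, not hard-code it.
GGML_SUPPORTED_OPS = {"matmul", "add", "softmax", "rmsnorm", "silu", "rope"}

@dataclass
class Node:
    """One operation in a toy source-framework compute graph."""
    name: str
    op: str
    inputs: list = field(default_factory=list)  # names of producer nodes

def convert_graph(nodes):
    """Map each source op onto a ggml op, preserving topological order.

    Returns a list of (node_name, ggml_op) pairs, or raises if any
    op has no ggml equivalent -- the case where conversion is NOT free.
    """
    converted = []
    unsupported = [n.op for n in nodes if n.op not in GGML_SUPPORTED_OPS]
    if unsupported:
        raise ValueError(f"ops missing in ggml: {sorted(set(unsupported))}")
    for n in nodes:
        converted.append((n.name, n.op))  # 1:1 mapping in this toy example
    return converted

# Toy graph: a single attention-flavoured chain of ops.
graph = [
    Node("q", "matmul"),
    Node("scores", "matmul", ["q"]),
    Node("probs", "softmax", ["scores"]),
    Node("out", "matmul", ["probs"]),
]

print(convert_graph(graph))
```

In practice the mapping is rarely 1:1 (fused ops, layout differences, quantization), which is part of why getting the logits to match exactly is a sensible first milestone before worrying about tokenizers and chat templates.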