llama.cpp usage
#7942
I was wondering how most people use llama.cpp. I mostly use the server's OpenAI-(semi)compatible endpoint for my own projects.
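For context, a minimal sketch of what calling that endpoint from Python can look like. This assumes llama-server is running locally on its default port 8080 and that the `openai` client package is installed; the model name, prompt, and API-key placeholder below are purely illustrative.

```python
# Minimal sketch: talking to a locally running llama-server through its
# OpenAI-compatible /v1 endpoint. Assumes the server was started with
# something like `llama-server -m model.gguf` and listens on localhost:8080.
from openai import OpenAI

# llama-server does not require an API key, but the client expects one,
# so any placeholder string works.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-no-key-required")

response = client.chat.completions.create(
    model="local-model",  # model name is largely ignored by a local server
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what llama.cpp is in one sentence."},
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)
```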
How do you primarily use llama.cpp?

- main (cli) application running in the terminal: 40%
- server application with the included browser-based UI: 20%
- server application with your own custom-built frontend: 40%
- server application with a third-party UI: 0%
- both the CLI and server applications in various configurations: 0%

5 votes
Replies: 0 comments