- ollama-java-client - The Ollama Java Client library
- ollama-java-client-starter - The Ollama Java Client Spring Boot 3 Starter
- spring-boot-ollama-sample - A sample Spring Boot 3 application using the Ollama Java Client Spring Boot 3 Starter
The OllamaService interface defines the interaction with the Ollama web service.
```java
public interface OllamaService {

    CompletionResponse completion(CompletionRequest completionRequest);

    TagsResponse getTags();

    ShowResponse show(ShowRequest showRequest);

    void copy(CopyRequest copyRequest);

    void delete(String modelName);

    void streamingCompletion(CompletionRequest completionRequest, StreamResponseProcessor<String> handler);

    EmbeddingResponse embed(EmbeddingRequest embeddingRequest);
}
```
The OllamaServiceFactory class is responsible for creating instances of the OllamaService. It provides static factory methods to create an instance of the service with the specified configuration.
```java
public class OllamaServiceFactory {

    public static OllamaService create(OllamaProperties properties) {
        // ...
    }

    public static OllamaService create(OllamaProperties properties, Gson gson) {
        // ...
    }
}
```
The StreamResponseProcessor interface provides methods to process streaming completion responses.
```java
public interface StreamResponseProcessor<T> {

    void processStreamItem(T item);

    void processCompletion(T fullResponse);

    void processError(Throwable throwable);
}
```
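As a minimal sketch, a processor might simply buffer the streamed chunks. The interface is redeclared here so the example compiles standalone; in a real project it comes from the ollama-java-client library, and the BufferingProcessor class is a hypothetical name, not part of the library:

```java
// The interface as defined by the library, redeclared for a self-contained example.
interface StreamResponseProcessor<T> {
    void processStreamItem(T item);
    void processCompletion(T fullResponse);
    void processError(Throwable throwable);
}

// Hypothetical implementation: collects streamed chunks into a buffer.
public class BufferingProcessor implements StreamResponseProcessor<String> {

    private final StringBuilder buffer = new StringBuilder();

    @Override
    public void processStreamItem(String item) {
        buffer.append(item); // called once per streamed chunk
    }

    @Override
    public void processCompletion(String fullResponse) {
        System.out.println("completed: " + fullResponse);
    }

    @Override
    public void processError(Throwable throwable) {
        throwable.printStackTrace();
    }

    public String buffered() {
        return buffer.toString();
    }
}
```

An instance of such a processor would be passed as the handler argument to streamingCompletion.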
Just create an instance of the OllamaService with the factory and use it.
Have a look here, or at the spring-boot-ollama-sample project.
https://github.com/jmorganca/ollama/blob/main/docs/api.md
https://github.com/jmorganca/ollama/blob/main/docs/linux.md
$ curl https://ollama.ai/install.sh | sh
>>> Installing ollama to /usr/local/bin...
>>> Creating ollama user...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> NVIDIA GPU installed.

# open http://localhost:11434/
# or via curl
$ curl http://localhost:11434/api/tags
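The same check can be done from Java without any library code; a minimal sketch using the JDK's built-in HTTP client (the TagsCheck class name is illustrative):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TagsCheck {

    // Build the GET /api/tags request; separated out so it can be inspected without a running server.
    static HttpRequest buildTagsRequest(String baseUrl) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/tags"))
                .GET()
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpRequest request = buildTagsRequest("http://localhost:11434");
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // JSON listing of locally available models
    }
}
```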
$ ollama run mistral
To view logs of Ollama running as a startup service, run:
$ journalctl -u ollama
Remove the ollama service:
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
Remove the ollama binary from your bin directory (either /usr/local/bin, /usr/bin, or /bin):
sudo rm $(which ollama)

Remove the downloaded models and Ollama service user:

sudo rm -r /usr/share/ollama
sudo userdel ollama