
OLLAMA Java Client

Modules

OllamaService

The OllamaService interface provides the operations for interacting with the Ollama web service.

public interface OllamaService {
    // Generate a completion for the given prompt (blocking).
    CompletionResponse completion(CompletionRequest completionRequest);
    // List the models available locally (GET /api/tags).
    TagsResponse getTags();
    // Show details about a model.
    ShowResponse show(ShowRequest showRequest);
    // Copy an existing model under a new name.
    void copy(CopyRequest copyRequest);
    // Delete a model by name.
    void delete(String modelName);
    // Generate a completion, streaming partial results to the handler.
    void streamingCompletion(CompletionRequest completionRequest, StreamResponseProcessor<String> handler);
    // Generate embeddings for the given input.
    EmbeddingResponse embed(EmbeddingRequest embeddingRequest);
}

OllamaServiceFactory

The OllamaServiceFactory class is responsible for creating instances of the OllamaService. It provides static factory methods to create an instance of the service with a given configuration.

public class OllamaServiceFactory {

    public static OllamaService create(OllamaProperties properties) {
        // ...
    }

    public static OllamaService create(OllamaProperties properties, Gson gson) {
        // ...
    }
}
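
For instance, creating a service might look like the sketch below. The OllamaProperties accessors are assumptions, since the README does not show that class:

// Sketch only: OllamaProperties is assumed to expose a base-URL setter;
// check the actual class for the real property names.
OllamaProperties properties = new OllamaProperties();
properties.setBaseUrl("http://localhost:11434");

// Default Gson configuration.
OllamaService service = OllamaServiceFactory.create(properties);

// Or supply a custom Gson instance for (de)serialization.
OllamaService customized = OllamaServiceFactory.create(properties, new GsonBuilder().create());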

StreamResponseProcessor

The StreamResponseProcessor interface provides methods to process streaming completion responses.

public interface StreamResponseProcessor<T> {
    // Called for each streamed chunk as it arrives.
    void processStreamItem(T item);
    // Called once with the fully assembled response when the stream ends.
    void processCompletion(T fullResponse);
    // Called if the stream fails.
    void processError(Throwable throwable);
}
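
Given a service created as in the previous sketch, streaming usage could look like this. The CompletionRequest constructor shown is an assumption:

StreamResponseProcessor<String> handler = new StreamResponseProcessor<String>() {
    @Override
    public void processStreamItem(String item) {
        System.out.print(item); // partial chunk, printed as it arrives
    }

    @Override
    public void processCompletion(String fullResponse) {
        System.out.println(); // the fully assembled response is also available here
    }

    @Override
    public void processError(Throwable throwable) {
        throwable.printStackTrace();
    }
};

// Assumed CompletionRequest constructor: (model, prompt).
service.streamingCompletion(new CompletionRequest("mistral", "Why is the sky blue?"), handler);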

How to use

Just create an OllamaService instance with the factory and use it, as in the sketch below.
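
Using the service created above, a blocking completion could look like this. The request constructor and response accessor are assumptions:

// Assumed CompletionRequest constructor: (model, prompt).
CompletionRequest request = new CompletionRequest("mistral", "Why is the sky blue?");
CompletionResponse response = service.completion(request);
System.out.println(response.getResponse()); // assumed accessor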

Have a look here, or at the spring-boot-ollama-sample project.

Useful Resources

API documentation

https://github.com/jmorganca/ollama/blob/main/docs/api.md

Linux install

https://github.com/jmorganca/ollama/blob/main/docs/linux.md

$ curl https://ollama.ai/install.sh | sh
>>> Installing ollama to /usr/local/bin...
>>> Creating ollama user...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> NVIDIA GPU installed.

Test URL

# open http://localhost:11434/ 
# or via curl 
$ curl http://localhost:11434/api/tags
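
The same check can be done through the Java client, assuming a service instance created as in the examples above:

// Mirrors GET /api/tags: lists the models available on the server.
TagsResponse tags = service.getTags();
System.out.println(tags);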

Install the Mistral 7B model

$ ollama run mistral

Viewing logs

To view logs of Ollama running as a startup service, run:

$ journalctl -u ollama

Uninstall

Remove the ollama service:

sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service

Remove the ollama binary from your bin directory (either /usr/local/bin, /usr/bin, or /bin):

sudo rm $(which ollama)

Remove the downloaded models and Ollama service user:

sudo rm -r /usr/share/ollama
sudo userdel ollama
