0xlibless/Chatty.cpp

Chatty.cpp

The simplest and fastest way to run a local LLM model on your Android device

Overview

Chatty.cpp is an open-source Android application that began as a personal project. It was created to help older adults who need an artificial intelligence assistant that does not rely on a network connection to answer urgent queries.

Main Features

• Character / Role Creator

Customize and define different personalities for the model.

• Simple Interface

Clean and easy-to-use design suitable for anyone.

• Automatic LFM2-1.2B Model Download

The app automatically fetches the LFM2-1.2B model for you.

• Designed for Mid-Range Devices

Optimized to run smoothly on mid-range Android hardware.

• 100% Offline

Everything works offline, except for the initial model download.
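The "download once, then run fully offline" behavior described above can be sketched as follows. This is an illustrative assumption, not the app's actual code: the file name, download URL, and class names are hypothetical, and a real Android app would typically wrap this in a background service or `DownloadManager` rather than a blocking stream copy.

```java
import java.io.File;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public class ModelFetcher {
    // Hypothetical file name; LFM2-1.2B GGUF builds are published on
    // Hugging Face, but the exact artifact the app fetches is an assumption.
    static final String MODEL_FILE = "LFM2-1.2B-Q4_K_M.gguf";

    static File modelFile(File modelsDir) {
        return new File(modelsDir, MODEL_FILE);
    }

    // Only touch the network when the model is missing from disk,
    // so every launch after the first works fully offline.
    static boolean needsDownload(File modelsDir) {
        return !modelFile(modelsDir).exists();
    }

    static void downloadModel(File modelsDir, String url) throws Exception {
        if (!needsDownload(modelsDir)) {
            return; // model already present: stay offline
        }
        modelsDir.mkdirs();
        try (InputStream in = new URL(url).openStream()) {
            Files.copy(in, modelFile(modelsDir).toPath(),
                       StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```

On Android the `modelsDir` would usually come from `Context.getFilesDir()`, so the model survives app restarts without needing storage permissions.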

About

Minimal Android app for running inference with LLM models on-device.
