# cpu-only

Here are 9 public repositories matching this topic...

πŸ¦™ chat-o-llama: A lightweight, modern web interface for AI conversations with support for both Ollama and llama.cpp backends. Features persistent conversation management, real-time backend switching, intelligent context compression, and a clean responsive UI.

  • Updated Dec 10, 2025
  • Python

A high-performance Python library for extracting structured content from PDF documents with layout-aware text extraction. pdf_to_json preserves document structure, including headings (H1–H6) and body text, and outputs clean JSON.

  • Updated Dec 16, 2025
  • Python
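To make "layout-aware JSON" concrete, here is a minimal sketch of the kind of output such a tool might emit. The schema below is hypothetical (not pdf_to_json's documented format): each extracted block keeps its structural role (`h1`–`h6` or `body`) alongside its text.

```python
import json

# Hypothetical schema for layout-aware PDF extraction output:
# each block records its role and text, preserving reading order.
blocks = [
    {"type": "h1", "text": "Introduction"},
    {"type": "body", "text": "Layout analysis groups words into blocks by position and font size."},
    {"type": "h2", "text": "Method"},
    {"type": "body", "text": "Heading levels are inferred from relative font sizes."},
]

# Serialize to clean, indented JSON.
output = json.dumps(blocks, indent=2)
print(output)
```

Keeping heading levels explicit in the output lets downstream consumers rebuild a document outline without re-parsing the PDF.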

CPU-friendly experience-based reasoning framework combining meta-learning (MAML), state space models (SSM), and memory buffers for fast few-shot adaptation. Pure NumPy implementation for edge devices and low-compute environments.

  • Updated Oct 23, 2025
  • Python
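The core of MAML-style few-shot adaptation is a short inner loop of gradient steps on a task's support set, which is cheap enough to run in pure NumPy on a CPU. The sketch below shows that inner loop on a toy linear-regression task; the function and variable names are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def inner_adapt(theta, x, y, lr=0.1, steps=50):
    """MAML-style inner loop: adapt meta-parameters `theta` to one
    task's support set (x, y) with a few gradient steps on MSE."""
    theta = theta.copy()
    for _ in range(steps):
        pred = x @ theta
        grad = 2.0 * x.T @ (pred - y) / len(x)  # gradient of mean squared error
        theta -= lr * grad
    return theta

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
x = rng.normal(size=(16, 2))   # small few-shot support set
y = x @ true_w
theta0 = np.zeros(2)           # stand-in for the meta-learned initialization
theta_task = inner_adapt(theta0, x, y)
```

In full MAML the outer loop would then update `theta0` so that this inner adaptation works well across many sampled tasks; the inner step above is the part that runs per task at adaptation time.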

Chat-O-Llama is a user-friendly web interface for managing conversations with Ollama, featuring persistent chat history. Easily set up and start your chat sessions with just a few commands. πŸ™πŸ’»

  • Updated Dec 31, 2025
  • HTML
