
Vector Notes

A Go-based system for syncing your notes to a vector database and querying them with AI assistance. The system consists of two main components:

  • vector-sync: Monitors your notes directory and syncs changes to a Pinecone vector database
  • note-gpt: AI-powered query interface for your vectorized notes, using Gemini

Features

  • πŸ”„ Real-time sync: Automatically detects file changes and updates the vector database
  • πŸ” Semantic search: Query your notes using natural language
  • πŸ€– AI assistance: Get contextual answers from your notes using Google's Gemini
  • πŸ“ File watching: Monitors .md files in your notes directory
  • 🌲 Tree structure: Maintains directory structure for efficient syncing
  • πŸ’Ύ Incremental updates: Only processes changed files

Prerequisites

  • Go 1.21 or higher
  • Pinecone account and API key
  • Google AI Studio API key for Gemini
  • Local embedding server (Ollama recommended)

Setup

1. Clone the Repository

git clone https://github.com/ashmaster/vector-notes.git
cd vector-notes

2. Install Dependencies

# Install vector-sync dependencies
cd vector-sync
go mod tidy
# Install note-gpt dependencies
cd ../note-gpt
go mod tidy

3. Setup Pinecone

  1. Create a Pinecone account
  2. Create a new index with the following specifications:
    • Dimension: 768 (for nomic-embed-text model)
    • Metric: Cosine
    • Index Name: joyful-elm (or update the code to match your index name)

4. Setup Local Embedding Server

Install and run Ollama with the nomic-embed-text model:

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Pull the embedding model
ollama pull nomic-embed-text
# Start Ollama server (runs on localhost:11434 by default)
ollama serve

5. Configure Environment Variables

Create .env files in both vector-sync/ and note-gpt/ directories:

vector-sync/.env:

PINECONE_API_KEY=your_pinecone_api_key
PINECONE_HOST=https://your-index-host.pinecone.io
NOTES_DIR=/path/to/your/notes/directory
EMBEDDING_URL=http://localhost:11434/api/embed

note-gpt/.env:

PINECONE_API_KEY=your_pinecone_api_key
PINECONE_HOST=https://your-index-host.pinecone.io
NOTES_DIR=/path/to/your/notes/directory
EMBEDDING_URL=http://localhost:11434/api/embed
GEMINI_API_KEY=your_gemini_api_key

Replace the placeholder values:

  • your_pinecone_api_key: Your Pinecone API key
  • https://your-index-host.pinecone.io: Your Pinecone index host URL
  • /path/to/your/notes/directory: Absolute path to your notes directory
  • your_gemini_api_key: Your Google AI Studio API key
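Both services need all of their keys at startup. Purely as an illustration (the repo's actual loading code may use a dotenv library instead), here is a stdlib-only sketch that parses a .env file and reports which required keys are missing:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// loadDotenv parses simple KEY=VALUE lines from a .env file into a map,
// skipping blank lines and # comments. A minimal stand-in, not the repo's
// actual loader.
func loadDotenv(path string) (map[string]string, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()
	env := map[string]string{}
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue
		}
		if k, v, ok := strings.Cut(line, "="); ok {
			env[strings.TrimSpace(k)] = strings.TrimSpace(v)
		}
	}
	return env, sc.Err()
}

// missing reports which required keys are absent or empty.
func missing(env map[string]string, required ...string) []string {
	var out []string
	for _, k := range required {
		if env[k] == "" {
			out = append(out, k)
		}
	}
	return out
}

func main() {
	env, err := loadDotenv(".env")
	if err != nil {
		fmt.Println("no .env file:", err)
		return
	}
	if m := missing(env, "PINECONE_API_KEY", "PINECONE_HOST", "NOTES_DIR", "EMBEDDING_URL"); len(m) > 0 {
		fmt.Println("missing keys:", m)
	}
}
```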

Usage

Running Vector Sync

The vector-sync service monitors your notes directory and keeps the vector database synchronized:

cd vector-sync
go run main.go

This will:

  • Build an initial tree structure of your notes
  • Start watching for file changes
  • Sync changes to Pinecone every 5 seconds
  • Log all sync operations

Running Note GPT

The note-gpt service provides an interactive query interface:

cd note-gpt
go run cmd/main.go

This starts an interactive session where you can:

  • Ask questions about your notes
  • Get AI-generated answers with file citations
  • Maintain conversation context across queries

Example interaction:

> What are the cake ingredients?
Based on your notes from "Cake ingredients.md", you'll need flour, sugar, eggs, butter, and baking powder.
> How much flour?
According to the same file, you need 2 cups of all-purpose flour.

Using VS Code

The repository includes VS Code launch configurations. You can:

  1. Open the workspace in VS Code
  2. Go to Run and Debug (Ctrl+Shift+D)
  3. Select either "Launch Vector Sync" or "Launch note-gpt"
  4. Press F5 to start debugging

Architecture

Vector Sync Flow

  1. File Watcher: Monitors .md files using fsnotify
  2. Tree Structure: Maintains the file hierarchy in a Tree structure (internal/tree.go)
  3. Diff Detection: Compares client and server trees to find changes
  4. Vector Upsert: Embeds content and stores it in Pinecone via the Vector client (pkg/vector.go)
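Step 3's diff can be illustrated by comparing per-path content hashes between the client and server trees; the Tree map and diff function below are a simplified stand-in for the repo's actual structures:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// Tree maps a note's relative path to a hash of its content; comparing two
// such maps (client vs. server) yields the upserts and deletes to apply.
type Tree map[string]string

// hash fingerprints a note's content for cheap change detection.
func hash(content string) string {
	sum := sha256.Sum256([]byte(content))
	return hex.EncodeToString(sum[:])
}

// diff returns paths to upsert (new or changed on the client) and paths to
// delete (present only on the server).
func diff(client, server Tree) (upserts, deletes []string) {
	for path, h := range client {
		if server[path] != h {
			upserts = append(upserts, path)
		}
	}
	for path := range server {
		if _, ok := client[path]; !ok {
			deletes = append(deletes, path)
		}
	}
	return upserts, deletes
}

func main() {
	client := Tree{"cake.md": hash("flour, sugar"), "todo.md": hash("buy eggs")}
	server := Tree{"cake.md": hash("flour"), "old.md": hash("stale")}
	up, del := diff(client, server)
	fmt.Println("upsert:", up, "delete:", del)
}
```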

Note GPT Flow

  1. Query Processing: Takes user input and vectorizes it
  2. Semantic Search: Retrieves the top 2 most relevant notes from Pinecone
  3. Context Building: Reads file contents and builds LLM context
  4. AI Response: Generates response using Gemini with conversation history
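Pinecone performs the similarity search server-side, but the cosine metric and top-2 selection it applies look like this (toy 3-dimensional vectors stand in for the real 768-dimensional embeddings):

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// topK returns the k note IDs whose vectors score highest against the query.
func topK(query []float64, notes map[string][]float64, k int) []string {
	type scored struct {
		id    string
		score float64
	}
	var all []scored
	for id, v := range notes {
		all = append(all, scored{id, cosine(query, v)})
	}
	sort.Slice(all, func(i, j int) bool { return all[i].score > all[j].score })
	if k > len(all) {
		k = len(all)
	}
	ids := make([]string, 0, k)
	for _, s := range all[:k] {
		ids = append(ids, s.id)
	}
	return ids
}

func main() {
	notes := map[string][]float64{
		"cake.md":  {1, 0, 0},
		"bread.md": {0.9, 0.1, 0},
		"car.md":   {0, 0, 1},
	}
	fmt.Println(topK([]float64{1, 0, 0}, notes, 2)) // the two baking notes rank highest
}
```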

File Structure

β”œβ”€β”€ vector-sync/ # Sync service
β”‚ β”œβ”€β”€ internal/
β”‚ β”‚ β”œβ”€β”€ config.go # Configuration management
β”‚ β”‚ β”œβ”€β”€ sync.go # Main synchronization logic
β”‚ β”‚ β”œβ”€β”€ tree.go # File tree operations
β”‚ β”‚ β”œβ”€β”€ node.go # Tree node structure
β”‚ β”‚ β”œβ”€β”€ watcher.go # File system watcher
β”‚ β”‚ └── utils.go # Utility functions
β”‚ β”œβ”€β”€ pkg/
β”‚ β”‚ β”œβ”€β”€ vector.go # Pinecone integration
β”‚ β”‚ └── embedding.go # Embedding API client
β”‚ └── main.go # Entry point
β”œβ”€β”€ note-gpt/ # Query service
β”‚ β”œβ”€β”€ cmd/
β”‚ β”‚ └── main.go # CLI interface
β”‚ β”œβ”€β”€ internal/
β”‚ β”‚ β”œβ”€β”€ app.go # Main application logic
β”‚ β”‚ └── config.go # Configuration management
β”‚ └── pkg/
β”‚   β”œβ”€β”€ vector.go # Pinecone query operations
β”‚   β”œβ”€β”€ embedding.go # Embedding API client
β”‚   └── gemini.go # Gemini AI client

Logging

Both services provide detailed logging:

  • File operations and sync status
  • Vector database operations
  • API calls and responses
  • Error details with context

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

License

This project is licensed under the MIT License.
