codebase-intelligence

🧩 Tibo – a powerful command-line tool designed to index your codebase, generate call graphs, and chunk code into a vector database. With tibo, you can query your codebase using natural language and retrieve contextually relevant files, functions, and code snippets effortlessly.

(Diagram: Tibo workflow)

Features

  • Codebase Indexing: Scans and organizes your project for easy querying.
  • Call Graph Generation: Maps relationships between functions and files.
  • Vector Database: Embeds code chunks for fast, intelligent retrieval.
  • Natural Language Queries: Ask questions about your code in plain English.
  • Context-Aware Results: Returns relevant files and snippets with added context from the call graph.

Installation

Get started by installing tibo:

pip install tibo

Find the latest version and additional details on the PyPI project page.

Usage

Follow these steps to integrate tibo into your workflow:

  1. Configure the Tool - Set up tibo with your OpenAI API key:
tibo config

OPTIONAL: Configure a local LLM for on-device AI processing:

tibo local

NOTE: You need to provide a model name and URL, and ensure the local LLM server is running on your device at the specified URL.

  2. Index Your Project - Navigate to your project directory and index your codebase:
cd /path/to/your/project
tibo index

Note: This creates a .tibo folder in your project root to store indexed data, call graphs, and vector embeddings.

  3. Query Your Codebase - Fetch relevant context by asking questions in natural language:
tibo fetch "my query to the codebase"

Results include the most relevant file names and code chunks. Full output is saved in .tibo/query_output/query_output.json; a short Python example of loading this output follows the steps below.

  4. NEW: Interact with Tibo Agent - Chat with the AI agent to understand the codebase better and get help with implementing new features:
tibo agent

NOTE: Requires running tibo config and adding an ANTHROPIC_API_KEY when prompted. The agent can use the tibo fetching tools if you have run 'tibo index' before. In the agent shell:

  • type 'exit' or 'quit' to quit the agent shell
  • type '#' followed by a command to execute a command directly in your terminal
  • type 'reset' to reset the conversation history

Extra tools:

  • agent can perform web searches (requires setting up OPENAI_API_KEY)
  • agent can get project structure details (requires running tibo index)
  • agent can read file contents when needed. NOTE: Editing, creating, and deleting files is coming soon...
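If you want to use the fetch results from step 3 programmatically, the saved JSON file can be loaded with a few lines of Python. This is a minimal sketch; the exact structure of query_output.json is not documented here, so the field handling below is an assumption. Inspect the file once before relying on specific keys.

import json
from pathlib import Path

# Path created by `tibo fetch`, as documented above.
output_path = Path(".tibo/query_output/query_output.json")
with output_path.open() as f:
    results = json.load(f)

# NOTE: the shape of the data is an assumption for illustration only;
# print the top-level structure first and adapt to what you find.
if isinstance(results, dict):
    print("Top-level keys:", list(results.keys()))
elif isinstance(results, list) and results:
    print(f"{len(results)} result entries, first one:")
    print(json.dumps(results[0], indent=2))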

How It Works

Configuration: Links tibo to your OpenAI API key for LLM-powered enhancements.

Indexing: Processes the codebase, builds a call graph, chunks files, enhances the chunks with GPT-4o-mini, and stores vector embeddings locally.
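To make the indexing idea concrete, here is a minimal, hypothetical Python sketch of the chunk-and-embed part: split files into fixed-size chunks, turn each chunk into a vector, and store everything locally. This is not tibo's implementation; the chunk size, the hashed bag-of-words embed() stand-in (a real pipeline would call an embedding model), and the index.json file name are all illustrative assumptions.

import json
import zlib
from pathlib import Path

CHUNK_LINES = 40  # assumed chunk size, for illustration only

def embed(text, dim=256):
    # Stand-in embedding: a deterministic hashed bag-of-words vector.
    # A real setup would call an embedding model instead; this only
    # keeps the sketch runnable without any API key.
    vec = [0.0] * dim
    for token in text.split():
        vec[zlib.crc32(token.encode()) % dim] += 1.0
    return vec

def chunk_file(path):
    lines = path.read_text(errors="ignore").splitlines()
    return ["\n".join(lines[i:i + CHUNK_LINES]) for i in range(0, len(lines), CHUNK_LINES)]

index = []
for path in Path(".").rglob("*.py"):  # restricted to Python files for the example
    for chunk in chunk_file(path):
        index.append({"file": str(path), "text": chunk, "vector": embed(chunk)})

# Persist locally, analogous in spirit to tibo's .tibo folder.
Path("index.json").write_text(json.dumps(index))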

Querying: Enhances your query with an LLM, matches it to the top relevant chunks, and supplements results with call graph context.
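Continuing the same sketch on the query side: embed the query with the same stand-in function, rank the stored chunks by cosine similarity, and print the top matches. The LLM-based query enhancement and the call graph supplementation that tibo performs are deliberately omitted; this only illustrates the vector-matching step.

import json
import math
import zlib
from pathlib import Path

def embed(text, dim=256):
    # Same deterministic hashed bag-of-words stand-in as in the indexing sketch.
    vec = [0.0] * dim
    for token in text.split():
        vec[zlib.crc32(token.encode()) % dim] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

index = json.loads(Path("index.json").read_text())  # written by the indexing sketch
query_vec = embed("where is the configuration file parsed?")

ranked = sorted(index, key=lambda entry: cosine(query_vec, entry["vector"]), reverse=True)
for entry in ranked[:5]:
    print(entry["file"])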

Requirements

  • Python 3.7+
  • An OpenAI API key (required for LLM functionality)

Contributing

We welcome contributions! Feel free to open issues or submit pull requests on our GitHub repository.

License

MIT License
