# 📚 ReadMeBro
**ReadMeBro** is an AI-powered (and AI-optional) documentation assistant for Python codebases.
It automatically scans your repository, finds functions and classes, detects where they are used, and generates **clear, concise documentation**.
You can use it **with AI** (via Ollama + CodeLlama) or **without AI** for a plain text README.
---
## 🚀 Features
- **`@readmebro` Decorator** → Capture metadata of functions and classes.
- **Usage Mapping** → Find where each function/class is used in your codebase.
- **Two Documentation Modes**:
- **Without AI** → Generate standard documentation with `readmebro generate`
- **With AI** → Use `readmebro generate --llm ollama --model codellama:7b` for rich, contextual docs
- **JSON Registry** → Store raw scan data in `documentation/raw/`.
- **Function Graph** → Auto-generate a visual map of how functions connect.
- **Minimal Setup** → Works out of the box with or without AI.
---
## 📦 Installation
```bash
pip install readmebro
```
Or install directly from source:
```bash
git clone https://github.com/samartha-siddhartha/readmebro.git
cd readmebro
pip install -e .
```
---
## ⚡ Quick Start
```python
from readmebro.decorator import readmebro

@readmebro
def add_numbers(a, b):
    """Simple addition function."""
    return a + b
```
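The feature list above mentions classes as well. Assuming class support mirrors the function case, the same decorator can wrap a class:

```python
from readmebro.decorator import readmebro

@readmebro
class Calculator:
    """Tiny example class; ReadMeBro captures its metadata the same way."""

    def multiply(self, a, b):
        return a * b
```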
1️⃣ **Run your Python code at least once**
This ensures the `@readmebro` decorator captures the function/class in the registry.
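Because Python decorators run when a module is imported, importing (or executing) the file that defines your decorated objects is enough. For example, assuming the snippet above lives in a hypothetical `math_utils.py`:

```python
# Importing the module triggers @readmebro at definition time;
# calling the function is optional but confirms everything works.
from math_utils import add_numbers

print(add_numbers(2, 3))  # -> 5
```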
2️⃣ **Scan your repository for usage**
```bash
readmebro scan
```
3️⃣ **Generate documentation**
---
### 🔹 Generate Documentation Without AI
```bash
readmebro generate
```
This creates a basic `README_GENERATED.md` (written to `documentation/`, see Output Structure below) from the captured metadata.
---
### 🔹 Generate AI-Powered Documentation (Recommended: CodeLlama)
First, install **Ollama** and pull the CodeLlama model:
```bash
ollama pull codellama:7b
```
Then run:
```bash
readmebro generate --llm ollama --model codellama:7b
```
> ⚠️ Note: ReadMeBro uses the **local Ollama HTTP API** via `requests` (default: `http://localhost:11434`).
> No external servers are contacted unless *you* configure Ollama otherwise.
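For the curious, the request ReadMeBro sends goes to Ollama's standard `/api/generate` endpoint. The sketch below is illustrative only (the real prompt and payload are ReadMeBro internals); it simply shows that everything stays on `localhost`:

```python
import requests

# Illustrative request against the local Ollama API; ReadMeBro's actual
# prompt construction and payload are internal to the tool.
payload = {
    "model": "codellama:7b",
    "prompt": "Write a short docstring for: def add_numbers(a, b): return a + b",
    "stream": False,
}
resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])
```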
---
## 📂 Output Structure
```
documentation/
  raw/
    readmebro_registry.json   # Captured code metadata
    readmebro_usage.json      # Usage mapping
  README_GENERATED.md         # Generated documentation
  function_graph.md           # Mermaid function graph
  function_graph.html         # Interactive graph (pyvis)
```
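Both JSON files are plain JSON, so they are easy to reuse in your own tooling. A small sketch (the keys inside are ReadMeBro internals, so inspect them rather than assuming a fixed schema):

```python
import json
from pathlib import Path

raw_dir = Path("documentation/raw")

# Load the captured metadata and usage mapping produced by `readmebro scan`.
registry = json.loads((raw_dir / "readmebro_registry.json").read_text())
usage = json.loads((raw_dir / "readmebro_usage.json").read_text())

# Explore the structure instead of assuming it.
print(type(registry).__name__, type(usage).__name__)
```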
---
## 🔗 Workflow Overview
```mermaid
graph TD
    A[Run your Python code] --> B[readmebro scan]
    B --> C[Usage & registry JSON]
    C --> D[readmebro generate]
    D --> E[README_GENERATED.md]
    D --> F[Function Graphs]
    F --> G[Mermaid + PyVis visualizations]
```
---
## 🖥 Requirements
* **Python 3.8+**
* *(Optional)* Ollama installed locally → [Install Ollama](https://ollama.ai/download)
* *(Optional)* Model downloaded for AI docs (recommended: CodeLlama 7B):
```bash
ollama pull codellama:7b
```
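Before running the AI mode, you can sanity-check that Ollama is up and the model is available. This uses Ollama's public `/api/tags` endpoint and is not part of ReadMeBro itself:

```python
import requests

try:
    resp = requests.get("http://localhost:11434/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is running. Local models:", models)
    if not any(name.startswith("codellama") for name in models):
        print("Run `ollama pull codellama:7b` before `readmebro generate --llm ollama`.")
except requests.exceptions.ConnectionError:
    print("Ollama does not appear to be running on http://localhost:11434.")
```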
---
## 📌 Commands
```bash
readmebro scan
readmebro generate
readmebro generate --llm ollama --model codellama:7b
```
---
## 🤝 Contributing
We welcome contributions!
1. Fork the repo
2. Create a feature branch (`git checkout -b feature-name`)
3. Commit changes (`git commit -m "Added feature"`)
4. Push and create a Pull Request
---
## 📜 License
MIT License — Feel free to use and modify.
---
**Made with ❤️ + 🦙 by Samartha Siddhartha**