# GitLlama 🦙🤖
AI-powered git automation tool with deep project understanding. GitLlama v0.5.0 uses hierarchical AI analysis and intelligent single-word decision-making to clone, analyze, optimize, commit, and push your code.
## 🌟 Key Features
- **🧠 Deep Project Analysis**: Hierarchical summarization system that understands entire codebases
- **🎯 Single-Word Decision System**: AI makes deterministic decisions with fuzzy matching for reliability
- **🌿 Intelligent Branch Selection**: AI analyzes existing branches and decides whether to reuse or create new ones
- **📝 TODO.md Integration**: Detects and follows project owner guidance from TODO.md files
- **📊 Smart File Operations**: AI selects up to 5 files to create, modify, or delete intelligently
- **🔄 Guided Questions**: AI asks strategic questions throughout analysis for better understanding
- **📈 Synthesis & Recommendations**: Provides actionable next steps and development priorities
- **📝 Detailed Decision Tracking**: See every AI decision with confidence scores and reasoning
## 🚀 Installation
```bash
pip install gitllama
```
## 📋 Prerequisites
GitLlama requires Ollama for AI-powered features:
```bash
# Install Ollama (if not already installed)
curl -fsSL https://ollama.com/install.sh | sh
# Start Ollama server
ollama serve
# Pull a recommended model
ollama pull gemma3:4b
```
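Before running GitLlama, you can verify that the Ollama server is reachable. A minimal sketch (the `/api/tags` endpoint is Ollama's model-listing route; the helper name is ours):

```python
import urllib.request


def ollama_available(url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server responds at the given base URL."""
    try:
        # /api/tags lists installed models; any 200 means the server is up.
        with urllib.request.urlopen(f"{url}/api/tags", timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False
```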
## 💻 Usage
### Basic usage (recommended):
```bash
gitllama https://github.com/user/repo.git
```
### With custom model:
```bash
gitllama https://github.com/user/repo.git --model llama3:8b
```
### With specific branch (AI handles all other decisions):
```bash
gitllama https://github.com/user/repo.git --branch feature/my-improvement
```
### Verbose mode (see all AI decisions):
```bash
gitllama https://github.com/user/repo.git --verbose
```
## 🔬 How It Works
GitLlama uses a sophisticated multi-step process to understand and improve repositories:
### 1. **Deep Repository Analysis** 🔍
- **Step 1: Data Gathering** - Scans all text files, configs, and documentation
- **Step 2: Smart Chunking** - Groups files to maximize AI context window usage
- **Step 3: Parallel Analysis** - Each chunk analyzed independently for scalability
- **Step 4: Hierarchical Merging** - Combines summaries using merge-sort approach
- **Step 5: Result Formatting** - Creates structured insights about the project
### 2. **Intelligent Workflow** 🤖
1. **Clones the repository**
2. **AI explores the project** - Deep multi-level analysis across all branches
3. **AI analyzes existing branches** - Evaluates reuse potential and compatibility
4. **AI decides on branch strategy** - Smart selection between reusing existing or creating new
5. **AI makes intelligent changes** - Based on comprehensive project understanding
6. **AI generates commit message** - Follows conventional commit format
7. **Pushes to remote**
### Example Analysis Output:
```
Starting hierarchical repository analysis
============================================================
STEP 1: DATA GATHERING
  Found 45 files with 12500 total tokens
STEP 2: CHUNKING
  Created 5 chunks for analysis
  Chunk 1: 12 files, 2500 tokens
  Chunk 2: 10 files, 2800 tokens
  Chunk 3: 8 files, 2200 tokens
  Chunk 4: 9 files, 2600 tokens
  Chunk 5: 6 files, 2400 tokens
STEP 3: CHUNK ANALYSIS
  Analyzing 5 chunks
  Processing chunk 1/5...
  Processing chunk 2/5...
  ...
STEP 4: HIERARCHICAL MERGING
  Starting hierarchical merge of 5 summaries
  Level 1: Merging 5 summaries (1800 tokens)
STEP 5: FORMAT RESULTS
  Formatting final results
============================================================
Repository analysis complete!
```
### 3. **Intelligent Branch Selection** 🌿
GitLlama now features sophisticated branch analysis and selection:
```
Starting intelligent branch selection process
============================================================
STEP 1: ANALYZE EXISTING BRANCHES
  Analyzing purposes of 3 branches
  Branch 'feature/auth-system': Production-ready authentication system
  Branch 'wip-database': Work-in-progress database optimization
  Branch 'docs/api': API documentation updates
STEP 2: EVALUATE REUSE POTENTIAL
  Evaluating reuse potential for existing branches
  wip-database: score=45, reasons=work-in-progress branch, matching project type
  feature/auth-system: score=35, reasons=feature branch, matching technologies
STEP 3: MAKE BRANCH DECISION
  Making branch selection decision
  🤖 AI: Deciding branch selection strategy with 2 candidates
  Decision: REUSE - High compatibility with existing WIP branch
STEP 4: GENERATE/SELECT BRANCH NAME
  Finalizing branch selection
  Selected existing branch: wip-database
============================================================
Branch selection complete: wip-database
```
#### Branch Selection Features:
- **🔍 Multi-branch Analysis**: Examines all branches in the repository
- **🎯 Smart Scoring**: Evaluates compatibility based on project type, technologies, and purpose
- **🔄 Reuse Preference**: Strongly favors reusing existing branches (80% bias)
- **🏗️ Branch Classification**: Identifies feature, fix, docs, and WIP branches
- **⚙️ Intelligent Fallback**: Creates new branches with meaningful names when needed
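The scoring in Step 2 of the log above can be sketched as additive heuristics. The weights below are illustrative guesses chosen to reproduce the sample scores, not GitLlama's actual values:

```python
def score_branch(name: str, purpose: str, project_type: str,
                 technologies: list[str]) -> int:
    """Additive reuse score: higher means a better reuse candidate."""
    score = 0
    if name.startswith(("wip-", "wip/")) or "work-in-progress" in purpose.lower():
        score += 25  # WIP branches are prime reuse candidates
    elif name.startswith(("feature/", "feat/")):
        score += 15  # feature branches are reusable but riskier
    if project_type.lower() in purpose.lower():
        score += 20  # branch purpose matches the project's type
    # Each technology mentioned in the branch's purpose adds to the score.
    score += 10 * sum(1 for t in technologies if t.lower() in purpose.lower())
    return score
```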
## 🐍 Python API
```python
from gitllama import GitAutomator, AICoordinator

# With AI - Full intelligent automation
ai = AICoordinator(model="gemma3:4b")
with GitAutomator(ai_coordinator=ai) as automator:
    results = automator.run_full_workflow(
        git_url="https://github.com/user/repo.git"
    )

    print(f"Success: {results['success']}")
    print(f"Branch created: {results['branch']}")
    print(f"Files modified: {results['modified_files']}")

    # Access detailed AI analysis
    if 'ai_analysis' in results:
        analysis = results['ai_analysis']
        print(f"Project Type: {analysis['project_type']}")
        print(f"Technologies: {', '.join(analysis['technologies'])}")
        print(f"Code Quality: {analysis['quality']}")
        print(f"Architecture: {analysis['architecture']}")

# Without AI - Simple automation
with GitAutomator() as automator:
    results = automator.run_full_workflow(
        git_url="https://github.com/user/repo.git",
        branch_name="my-branch",
        commit_message="My changes"
    )
```
## 🏗️ Architecture
GitLlama is built with a modular architecture for easy extension:
```
gitllama/
├── cli.py               # Command-line interface
├── git_operations.py    # Git automation logic
├── ai_coordinator.py    # AI workflow coordination
├── project_analyzer.py  # Hierarchical project analysis
├── branch_analyzer.py   # Intelligent branch selection (NEW!)
├── config.py            # Configuration and logging setup
└── ollama_client.py     # Ollama API integration
```
### Key Components:
- **ProjectAnalyzer**: Handles the 5-step hierarchical analysis process
- **BranchAnalyzer**: Intelligent branch selection with 4-step decision pipeline (NEW!)
- **AICoordinator**: Orchestrates AI decisions throughout the workflow
- **GitAutomator**: Manages git operations with optional AI integration
- **OllamaClient**: Interfaces with local Ollama models
## 🤖 AI Models
The tool works with any Ollama model. Recommended models:
- `gemma3:4b` - Fast and efficient (default)
- `llama3.2:1b` - Ultra-fast for simple tasks
- `codellama:7b` - Optimized for code understanding
- `mistral:7b` - Good general purpose
- `gemma2:2b` - Very fast, good for simple tasks
### Context Window Sizes:
- Small models (1-3B): ~2-4K tokens
- Medium models (7B): ~4-8K tokens
- Large models (13B+): ~8-16K tokens
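Chunk sizes follow from the context window: reserve part of it for the prompt and the model's response, then pack files up to the remaining budget. A minimal sketch, where the reserve ratio and the greedy packing strategy are assumptions:

```python
def chunk_budget(context_window: int, reserve_ratio: float = 0.25) -> int:
    """Tokens available per chunk after reserving prompt/response space."""
    return int(context_window * (1 - reserve_ratio))


def make_chunks(file_tokens: dict[str, int], budget: int) -> list[list[str]]:
    """Greedily pack files (largest first) into chunks within the budget."""
    chunks: list[list[str]] = []
    current: list[str] = []
    used = 0
    for path, tokens in sorted(file_tokens.items(), key=lambda kv: -kv[1]):
        if current and used + tokens > budget:
            chunks.append(current)  # current chunk is full; start a new one
            current, used = [], 0
        current.append(path)
        used += tokens
    if current:
        chunks.append(current)
    return chunks
```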
## 🎯 What Gets Analyzed
GitLlama intelligently analyzes:
- Source code files (Python, JavaScript, Java, Go, Rust, etc.)
- Configuration files (JSON, YAML, TOML, etc.)
- Documentation (Markdown, README, LICENSE)
- Build files (Dockerfile, Makefile, package.json)
- Scripts (Shell, Batch, PowerShell)
- Web assets (HTML, CSS, XML)
## 📊 Analysis Results
The AI provides multi-level insights:
```json
{
  "project_type": "web-application",
  "technologies": ["Python", "FastAPI", "PostgreSQL", "React"],
  "state": "Production-ready with comprehensive test coverage",
  "architecture": "Microservices with REST API",
  "quality": "High - follows best practices",
  "patterns": ["MVC", "Repository Pattern", "Dependency Injection"],
  "analysis_metadata": {
    "total_files": 156,
    "total_tokens": 45000,
    "chunks_created": 12,
    "context_window": 4096,
    "model": "gemma3:4b"
  }
}
```
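If you persist the analysis to disk, the result is plain JSON and easy to consume downstream (this snippet assumes the structure shown above):

```python
import json

raw = """{
  "project_type": "web-application",
  "technologies": ["Python", "FastAPI"],
  "analysis_metadata": {"total_files": 156, "chunks_created": 12}
}"""

analysis = json.loads(raw)
print(analysis["project_type"])
print(", ".join(analysis["technologies"]))
print(analysis["analysis_metadata"]["chunks_created"])
```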
## ⚙️ Configuration
```bash
# Use a different Ollama server
gitllama https://github.com/user/repo.git --ollama-url http://remote-server:11434
# Use a specific model with more context
gitllama https://github.com/user/repo.git --model codellama:7b
# Verbose output for debugging
gitllama https://github.com/user/repo.git --verbose
```
## 🔧 Extending GitLlama
The modular design makes it easy to add new analysis steps:
```python
# In project_analyzer.py, each step is clearly separated:

def _step1_gather_repository_data(self, repo_path):
    """STEP 1: Gather all repository data"""
    # Add git history analysis here
    # Add dependency scanning here
    # Add security checks here

def _step2_create_chunks(self, files):
    """STEP 2: Create smart chunks"""
    # Add semantic grouping here
    # Add priority-based chunking here

def _step3_analyze_chunks(self, chunks):
    """STEP 3: Analyze each chunk"""
    # Add code quality metrics here
    # Add security scanning here
    # Add performance analysis here
```
## 📈 Performance
- **Small repos (<100 files)**: ~30 seconds
- **Medium repos (100-500 files)**: ~1-2 minutes
- **Large repos (500+ files)**: ~2-5 minutes
*Times vary with model size and system performance.*
## 🛠️ Development
```bash
git clone https://github.com/your-org/gitllama.git
cd gitllama
pip install -e ".[dev]"
# Run tests
pytest
# Check code quality
make lint
make format
make type-check
```
## 🐛 Troubleshooting
### Ollama not available?
```bash
# Check if Ollama is running
curl http://localhost:11434/api/tags
# Start Ollama
ollama serve
```
### Context window too small?
```bash
# Use a model with larger context
gitllama repo.git --model mistral:7b
```
### Analysis taking too long?
```bash
# Use a smaller, faster model
gitllama repo.git --model llama3.2:1b
```
## 📝 License
GPL v3 - see LICENSE file
## 🤝 Contributing
Contributions are welcome! The modular architecture makes it easy to add:
- New analysis steps
- Support for additional AI models
- More file type handlers
- Enhanced decision strategies
## 🚀 Future Enhancements
- [ ] Git history analysis
- [ ] Dependency vulnerability scanning
- [ ] Parallel chunk processing
- [ ] Code quality metrics
- [ ] Security analysis
- [ ] Test coverage assessment
- [ ] README generation
- [ ] Automatic PR descriptions
- [ ] Multi-language documentation
---
**Note**: GitLlama requires git credentials configured for pushing to repositories. Ensure you have proper access rights to the repositories you're modifying.