| Field | Value |
|-------|-------|
| Name | llm-context-cli |
| Version | 0.0.5 |
| Summary | A tool to combine multiple files into a single file for AI context |
| home_page | https://github.com/atomgradient/llm-context |
| upload_time | 2025-02-12 06:10:35 |
| maintainer | None |
| docs_url | None |
| author | AtomGradient |
| requires_python | >=3.7 |
| license | MIT License (full text below) |
| keywords | llm, context, file, combine |
| bugtrack_url | None |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |

MIT License

Copyright (c) [2025] [AtomGradient]

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

# LLM Context CLI
<div align="center">
[Python 3.7+](https://www.python.org/downloads/)
[License: MIT](https://opensource.org/licenses/MIT)
[PyPI package](https://badge.fury.io/py/llm-context)
🚀 A powerful command-line tool to combine multiple files into a single file formatted for Large Language Model (LLM) context.
</div>
## 🌟 Features
- 📁 Combine multiple files into a single document
- 🎯 Smart file pattern matching and exclusion
- 🎨 Markdown-formatted output with syntax highlighting
- 🔄 Preserve file structure and organization
- ⚡️ Fast and efficient processing
- 🛠 Customizable output formatting
## 📋 Prerequisites
- Python 3.7 or higher
- pip (Python package installer)
## 🚀 Installation
Install using pip:
```bash
pip install llm-context-cli
```
Or install with pipx for isolated environments:
```bash
pipx install llm-context-cli
```
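To confirm the package installed correctly, you can query pip directly; this only uses standard pip commands and checks the metadata shown at the top of this page, not the tool itself:
```bash
# Show the installed version and metadata of the package
pip show llm-context-cli
```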
## 💻 Usage
### Basic Usage
Simply point the tool to your project directory:
```bash
llm-context-cli /path/to/project
```
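If you want to inspect the combined context before handing it to an LLM, one option is to write it to an explicit file with the documented `--output` flag and preview it. The path `context.txt` below is just an illustrative choice:
```bash
# Write the combined context to a known path, then preview the first lines
llm-context-cli /path/to/project --output context.txt
head -n 40 context.txt
```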
### Advanced Usage
Customize the output with various options:
```bash
llm-context-cli /path/to/project \
    --patterns "*.py, *.js" \
    --exclude "__pycache__/*, node_modules/*" \
    --output llmcontext.txt \
    --header "Project source code for review:" \
    --no-language
```
## 🎮 Command Options
| Option | Short | Description | Example |
|--------|-------|-------------|----------|
| `--patterns` | `-p` | File patterns to include (multiple allowed) | `--patterns "*.py, *.js"` |
| `--exclude` | `-e` | Patterns to exclude (multiple allowed) | `--exclude "test/*, *.pyc"` |
| `--output` | `-o` | Output file path | `--output combined.txt` |
| `--header` | `-h` | Optional header text | `--header "Source code:"` |
| `--language/--no-language` | - | Toggle language hints in code blocks | `--no-language` |
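
For reference, the same kind of invocation can be written with the short flags from the table above. The comma-separated pattern strings follow the README's own examples; the paths and patterns here are hypothetical, so adjust them to your project:
```bash
# Short-flag form of the options above (hypothetical patterns and output path)
llm-context-cli . \
    -p "*.py, *.md" \
    -e ".venv/*, build/*" \
    -o context.txt
```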
## 📝 Examples
### Combining Python and JavaScript Files
```bash
llm-context-cli . \
    --patterns "*.py, *.js" \
    --exclude "tests/*" \
    --header "Frontend and Backend Source Code:"
```
### Processing a Specific Directory
```bash
llm-context-cli ./src \
    --patterns "*.ts" \
    --output typescript-code.txt \
    --language
```
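Once generated, the combined file is meant to be pasted into an LLM prompt. On macOS or Linux you can copy it straight to the clipboard with the system tools below (assuming `pbcopy` or `xclip` is available; neither is part of this package):
```bash
# macOS: copy the combined file to the clipboard
pbcopy < typescript-code.txt

# Linux (X11): same idea with xclip
xclip -selection clipboard < typescript-code.txt
```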
## 🤝 Contributing
Contributions are welcome! Here's how you can help:
1. Fork the repository
2. Create a new branch (`git checkout -b feature/amazing-feature`)
3. Make your changes
4. Commit your changes (`git commit -m 'Add amazing feature'`)
5. Push to the branch (`git push origin feature/amazing-feature`)
6. Open a Pull Request
## 📄 License
This project is licensed under the MIT License.
## 🙏 Acknowledgments
- Thanks to all contributors who have helped shape this tool
- Inspired by the need for better LLM context management
---
<div align="center">
Made with ❤️ for the AI and developer community
</div>
## Raw data

```json
{
"_id": null,
"home_page": "https://github.com/atomgradient/llm-context",
"name": "llm-context-cli",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.7",
"maintainer_email": null,
"keywords": "llm, context, file, combine",
"author": "AtomGradient",
"author_email": "atomgradient <alex@atomgradient.com>",
"download_url": "https://files.pythonhosted.org/packages/89/24/6764ec61948862931e16dd0f6b51ad894728daa2d82aa2f03ea8f8e53c7c/llm_context_cli-0.0.5.tar.gz",
"platform": null,
"description": "# LLM Context CLI\n\n<div align=\"center\">\n\n[](https://www.python.org/downloads/)\n[](https://opensource.org/licenses/MIT)\n[](https://badge.fury.io/py/llm-context)\n\n\ud83d\ude80 A powerful command-line tool to combine multiple files into a single file formatted for Large Language Model (LLM) context.\n</div>\n\n## \ud83c\udf1f Features\n\n- \ud83d\udcc1 Combine multiple files into a single document\n- \ud83c\udfaf Smart file pattern matching and exclusion\n- \ud83c\udfa8 Markdown-formatted output with syntax highlighting\n- \ud83d\udd04 Preserve file structure and organization\n- \u26a1\ufe0f Fast and efficient processing\n- \ud83d\udee0 Customizable output formatting\n\n## \ud83d\udccb Prerequisites\n\n- Python 3.7 or higher\n- pip (Python package installer)\n\n## \ud83d\ude80 Installation\n\nInstall using pip:\n\n```bash\npip install llm-context-cli\n```\n\nOr install with pipx for isolated environments:\n\n```bash\npipx install llm-context-cli\n```\n\n## \ud83d\udcbb Usage\n\n### Basic Usage\n\nSimply point the tool to your project directory:\n\n```bash\nllm-context-cli /path/to/project\n```\n\n### Advanced Usage\n\nCustomize the output with various options:\n\n```bash\nllm-context-cli /path/to/project \\\n --patterns \"*.py, *.js\" \\\n --exclude \"__pycache__/*, node_modules/*\" \\\n --output llmcontext.txt \\\n --header \"Project source code for review:\" \\\n --no-language\n```\n\n## \ud83c\udfae Command Options\n\n| Option | Short | Description | Example |\n|--------|-------|-------------|----------|\n| `--patterns` | `-p` | File patterns to include (multiple allowed) | `--patterns \"*.py, *.js\"` |\n| `--exclude` | `-e` | Patterns to exclude (multiple allowed) | `--exclude \"test/*, *.pyc\"` |\n| `--output` | `-o` | Output file path | `--output combined.txt` |\n| `--header` | `-h` | Optional header text | `--header \"Source code:\"` |\n| `--language/--no-language` | - | Toggle language hints in code blocks | `--no-language` |\n\n## \ud83d\udcdd Examples\n\n### Combining Python and JavaScript Files\n\n```bash\nllm-context-cli . \\\n --patterns \"*.py, *.js\" \\\n --exclude \"tests/*\" \\\n --header \"Frontend and Backend Source Code:\"\n```\n\n### Processing a Specific Directory\n\n```bash\nllm-context-cli ./src \\\n --patterns \"*.ts\" \\\n --output typescript-code.txt \\\n --language\n```\n\n## \ud83e\udd1d Contributing\n\nContributions are welcome! Here's how you can help:\n\n1. Fork the repository\n2. Create a new branch (`git checkout -b feature/amazing-feature`)\n3. Make your changes\n4. Commit your changes (`git commit -m 'Add amazing feature'`)\n5. Push to the branch (`git push origin feature/amazing-feature`)\n6. Open a Pull Request\n\n## \ud83d\udcc4 License\n\nThis project is licensed under the MIT License.\n\n## \ud83d\ude4f Acknowledgments\n\n- Thanks to all contributors who have helped shape this tool\n- Inspired by the need for better LLM context management\n\n---\n\n<div align=\"center\">\nMade with \u2764\ufe0f for the AI and developer community\n</div>\n",
"bugtrack_url": null,
"license": "MIT License\n \n Copyright (c) [2025] [AtomGradient]\n \n Permission is hereby granted, free of charge, to any person obtaining a copy\n of this software and associated documentation files (the \"Software\"), to deal\n in the Software without restriction, including without limitation the rights\n to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n copies of the Software, and to permit persons to whom the Software is\n furnished to do so, subject to the following conditions:\n \n The above copyright notice and this permission notice shall be included in all\n copies or substantial portions of the Software.\n \n THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n SOFTWARE.",
"summary": "A tool to combine multiple files into a single file for AI context",
"version": "0.0.5",
"project_urls": {
"Homepage": "https://github.com/AtomGradient/llm-context"
},
"split_keywords": [
"llm",
" context",
" file",
" combine"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "ff0a83aecc7b5f11eeeafbfd00d8b9c2b126fc51e52e53453f27441c9573e233",
"md5": "8ef7163cb03160a3d617498a48c44146",
"sha256": "f22a0d4cc72b2140f73839f18f19663fd4a3605e661c3e6b73f76cd161f88572"
},
"downloads": -1,
"filename": "llm_context_cli-0.0.5-py3-none-any.whl",
"has_sig": false,
"md5_digest": "8ef7163cb03160a3d617498a48c44146",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.7",
"size": 6007,
"upload_time": "2025-02-12T06:10:32",
"upload_time_iso_8601": "2025-02-12T06:10:32.773604Z",
"url": "https://files.pythonhosted.org/packages/ff/0a/83aecc7b5f11eeeafbfd00d8b9c2b126fc51e52e53453f27441c9573e233/llm_context_cli-0.0.5-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "89246764ec61948862931e16dd0f6b51ad894728daa2d82aa2f03ea8f8e53c7c",
"md5": "6d02efd799e538569eebfa5280cf7b01",
"sha256": "8b1d8753f61dc718e8a5f780edfa6a020f6b43aac167115da068cdcd15248ddc"
},
"downloads": -1,
"filename": "llm_context_cli-0.0.5.tar.gz",
"has_sig": false,
"md5_digest": "6d02efd799e538569eebfa5280cf7b01",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.7",
"size": 5175,
"upload_time": "2025-02-12T06:10:35",
"upload_time_iso_8601": "2025-02-12T06:10:35.756216Z",
"url": "https://files.pythonhosted.org/packages/89/24/6764ec61948862931e16dd0f6b51ad894728daa2d82aa2f03ea8f8e53c7c/llm_context_cli-0.0.5.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-02-12 06:10:35",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "atomgradient",
"github_project": "llm-context",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "llm-context-cli"
}
```