# Important: Project Name Change
This project is being renamed from `llm-code-context` to `llm-context` to better reflect its capability to handle various types of text files, not just code. This repository will be renamed, and future releases will be under the new name. Please update your references and dependencies accordingly.
For the latest version and updates, please visit: [https://github.com/cyberchitta/llm-context.py](https://github.com/cyberchitta/llm-context.py)
Thank you for your understanding and continued support!
# LLM Code Context
LLM Code Context is a Python-based tool designed to streamline the process of sharing code context with Large Language Models (LLMs) *using a standard Chat UI*. It allows developers to easily select, format, and copy relevant code snippets and project structure information, enhancing the quality of interactions with AI assistants in coding tasks.
This project was developed with significant input from Claude 3 Opus and Claude 3.5 Sonnet. All of the code that makes it into the repo is human curated (by me 😇, [@restlessronin](https://github.com/restlessronin)).
## Features
- **File Selection**: Offers a command-line interface for selecting files from your project.
- **Intelligent Ignoring**: Respects `.gitignore` rules and additional custom ignore patterns to exclude irrelevant files.
- **Clipboard Integration**: Automatically copies the generated context to your clipboard for easy pasting.
- **Optional Technical Summary**: Allows inclusion of a markdown file summarizing the project's technical aspects.
## Installation
### Using pipx (Recommended)
[pipx](https://pypa.github.io/pipx/) is a tool to help you install and run end-user CLI applications written in Python.
1. If you haven't installed pipx yet, follow the installation instructions in the pipx documentation.
2. Once pipx is installed, you can install LLM Code Context:
   ```sh
   pipx install llm-code-context
   ```
This will install LLM Code Context in an isolated environment and make its commands available in your shell.
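To confirm the install worked, you can use standard pipx and shell checks (nothing here is specific to LLM Code Context):
```sh
# Show pipx-managed packages; llm-code-context should be listed
pipx list

# Confirm the entry points are on your PATH
which lcc-select lcc-gencontext lcc-genfiles
```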
## Usage
LLM Code Context offers several command-line tools, each designed for a specific task. All commands should be run from the root directory of your project, where the primary `.gitignore` file is located.
Here are the main commands:
```sh
# Select all files which are not gitignored
lcc-select
# Generate full context (including folder structure and summary), using selected files
lcc-gencontext
# Generate full text contents from a list of paths in the clipboard
lcc-genfiles
```
### Typical workflow
Let's say that you are collaborating with an LLM on a code repo. Use a system or custom prompt similar to [this `custom-prompt.md`](.llm-code-context/custom-prompt.md).
#### Provide context for your chat
1. Navigate to your project's root directory in the terminal.
2. Edit the project configuration file `.llm-code-context/config.json` and add to the `"gitignores"` key any files that belong in git but add little value as code context (e.g., `LICENSE`, `poetry.lock`, maybe even `README.md`); see the example configuration after this list.
3. Run `lcc-select` to choose the files you want to include in your context. You can look at `.llm-code-context/scratch.json` to see which files are currently selected. If you prefer, you can edit the scratch file directly before the next step.
4. Run `lcc-gencontext` to generate and copy the full text of all selected files, the folder structure diagram, and the technical summary of the project (if available).
5. Paste the context into the first message of your conversation with the LLM, or equivalently into a Claude project file.
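As a rough sketch of what that configuration might look like, here is a hypothetical `.llm-code-context/config.json` combining the `"gitignores"` key from step 2 with the optional `"summary_file"` key described later. The exact schema (for example, whether `"gitignores"` takes a list of patterns) is an assumption, so adapt it to your setup:
```json
{
  "gitignores": ["LICENSE", "poetry.lock", "README.md"],
  "summary_file": "tech-summary.md"
}
```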
#### Respond to LLM requests for files
1. The LLM will request a list of files in a markdown block quote (see the illustrative example after this list).
2. Select the block and copy it to the clipboard.
3. Run `lcc-genfiles` to copy the text contents of the requested files into the clipboard (replacing its previous contents, the file list).
4. Paste the file contents into the next user message in the chat.
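For illustration, a request from the LLM might look like the block quote below (the paths are taken from this repository's own layout; the exact path format `lcc-genfiles` expects is an assumption):
```
> src/llm_code_context/config_manager.py
> src/llm_code_context/file_selector.py
```
Copying that block and running `lcc-genfiles` then leaves the full contents of those files in the clipboard, ready to paste into the chat.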
## Technical Summary
LLM Code Context supports an optional technical summary feature, although **its utility is currently unclear**. This feature allows you to include a markdown file that provides project-specific information that may not be easily inferred from the code alone. To use this feature:
1. Create a markdown file in your `.llm-code-context` folder (e.g., `.llm-code-context/tech-summary.md`).
2. In your `.llm-code-context/config.json` file, set the `summary_file` key to the name of your summary file:
   ```json
   {
     "summary_file": "tech-summary.md"
   }
   ```
If the key is missing or null, no summary will be included in the context.
The summary can include information like architectural decisions, non-obvious performance considerations, or future plans. For example:
- "We chose a microservices architecture to allow for independent scaling of components."
- "The `process_data()` function uses custom caching to optimize repeated calls with similar inputs."
- "The authentication system is slated for an overhaul in Q3 to implement OAuth2."
When you run `lcc-gencontext`, this summary will be included after the folder structure diagram in the generated context.
For an example of a technical summary, you can refer to the [`tech-summary.md` file for this repository](.llm-code-context/tech-summary.md).
## Project Structure
```
└── llm-code-context.py
    ├── .gitignore
    ├── .llm-code-context
    │   ├── .gitignore
    │   ├── config.json
    │   ├── custom-prompt.md
    │   ├── tech-summary.md
    │   └── templates
    │       ├── full-context.j2
    │       └── sel-file-contents.j2
    ├── LICENSE
    ├── MANIFEST.in
    ├── README.md
    ├── poetry.lock
    ├── pyproject.toml
    ├── src
    │   └── llm_code_context
    │       ├── __init__.py
    │       ├── config_manager.py
    │       ├── context_generator.py
    │       ├── file_selector.py
    │       ├── folder_structure_diagram.py
    │       ├── git_ignorer.py
    │       ├── path_converter.py
    │       ├── pathspec_ignorer.py
    │       ├── template.py
    │       └── templates
    │           ├── full-context.j2
    │           └── sel-file-contents.j2
    └── tests
        ├── test_path_converter.py
        └── test_pathspec_ignorer.py
```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
This project is licensed under the Apache License, Version 2.0. See the [LICENSE](LICENSE) file for details.