| Field | Value |
| --- | --- |
| Name | llm-context |
| Version | 0.2.13 |
| home_page | None |
| Summary | Share code with LLMs via Model Context Protocol or clipboard. Profile-based customization enables easy switching between different tasks (like code review and documentation). Code outlining support is available as an experimental feature. |
| upload_time | 2025-02-28 05:42:06 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.10 |
| license | None |
| keywords | ai, chat, clipboard, code, context, llm |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# LLM Context
[License: Apache 2.0](https://opensource.org/licenses/Apache-2.0)
[PyPI](https://pypi.org/project/llm-context/)
LLM Context is a tool that helps developers quickly inject relevant content from code/text projects into Large Language Model chat interfaces. It leverages `.gitignore` patterns for smart file selection and provides both a streamlined clipboard workflow using the command line and direct LLM integration through the Model Context Protocol (MCP).
> **Note**: This project was developed in collaboration with Claude-3.5-Sonnet (and more recently Grok-3), using LLM Context itself to share code during development. All code in the repository is human-curated (by me 😇, @restlessronin).
## Important: Configuration File Format Change
Configuration files were converted from TOML to YAML in v0.2.9. Existing users **must manually convert** any customizations from the old `.llm-context/config.toml` to the new `.llm-context/config.yaml`.
## Why LLM Context?
For an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: [LLM Context: Harnessing Vanilla AI Chats for Development](https://www.cyberchitta.cc/articles/llm-ctx-why.html)
## Current Usage Patterns
- **Direct LLM Integration**: Native integration with Claude Desktop via the MCP protocol
- **Chat Interface Support**: Works with any LLM chat interface via CLI/clipboard
  - Optimized for interfaces with persistent context, like Claude Projects and Custom GPTs
  - Works equally well with standard chat interfaces
- **Project Types**: Suitable for code repositories and collections of text/markdown/HTML documents
- **Project Size**: Optimized for projects that fit within an LLM's context window; large-project support is in development
## Installation
Install LLM Context using [uv](https://github.com/astral-sh/uv):
```bash
uv tool install llm-context
```
To upgrade to the latest version:
```bash
uv tool upgrade llm-context
```
> **Warning**: LLM Context is under active development. Updates may overwrite configuration files prefixed with `lc-`. We recommend keeping all configuration files under version control for this reason.
## Quickstart
### MCP with Claude Desktop
Add to `claude_desktop_config.json`:
```jsonc
{
  "mcpServers": {
    "CyberChitta": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```
Once configured, you can start working with your project in two simple ways:
1. Say: "I would like to work with my project"
Claude will ask you for the project root path.
2. Or directly specify: "I would like to work with my project /path/to/your/project"
Claude will automatically load the project context.
### CLI Quick Start and Typical Workflow
1. Navigate to your project's root directory
2. Initialize repository: `lc-init` (only needed once)
3. (Optional) Edit `.llm-context/config.yaml` to customize ignore patterns
4. Select files: `lc-sel-files`
5. (Optional) Review selected files in `.llm-context/curr_ctx.yaml`
6. Generate context: `lc-context`
7. Use with your preferred interface:
- Project Knowledge (Claude Pro): Paste into knowledge section
- GPT Knowledge (Custom GPTs): Paste into knowledge section
- Regular chats: Use `lc-set-profile code-prompt` first to include instructions
8. When the LLM requests additional files:
- Copy the file list from the LLM
- Run `lc-read-cliplist`
- Paste the contents back to the LLM
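The round trip in step 8 can be sketched conceptually: given the list of paths the LLM asked for, read each file and bundle the contents for pasting back. This is an illustrative sketch only, not `lc-read-cliplist`'s actual implementation — the real command reads the path list from the clipboard, and the fenced output format here is an assumption.

```python
from pathlib import Path


def read_file_list(paths: list[str]) -> str:
    """Concatenate file contents with path headers, ready to paste to an LLM.

    Illustrative sketch: the real lc-read-cliplist takes paths from the
    clipboard and may format its output differently.
    """
    chunks = []
    for p in paths:
        text = Path(p).read_text()
        # One fenced block per file, labeled with its path.
        chunks.append(f"## {p}\n```\n{text}\n```")
    return "\n\n".join(chunks)
```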
## Core Commands
- `lc-init`: Initialize project configuration
- `lc-set-profile <name>`: Switch profiles
- `lc-sel-files`: Select files for inclusion
- `lc-context`: Generate and copy context
- `lc-prompt`: Generate project instructions for LLMs
- `lc-read-cliplist`: Process LLM file requests
- `lc-changed`: List files modified since last context generation
- `lc-outlines`: Generate outlines for code files (requires the `[outline]` extra at `uv` install time)
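The idea behind `lc-changed` — listing files touched after a reference point — can be approximated with a modification-time comparison. This is a conceptual sketch under assumed semantics; the actual tool compares against the time the context was last generated, not an arbitrary timestamp:

```python
import time
from pathlib import Path


def changed_since(root: str, timestamp: float) -> list[str]:
    """Return files under root whose mtime is newer than the given timestamp.

    Conceptual approximation of lc-changed; the real command tracks the
    last context-generation time rather than taking a timestamp argument.
    """
    return [
        str(p)
        for p in Path(root).rglob("*")
        if p.is_file() and p.stat().st_mtime > timestamp
    ]
```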
## Features & Advanced Usage
LLM Context provides advanced features for customizing how project content is captured and presented:
- Smart file selection using `.gitignore` patterns
- Multiple profiles for different use cases
- Code outline generation for supported languages
- Easy viewing of code structure with `lc-outlines` command
- Customizable templates and prompts
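The smart-selection idea can be illustrated with stdlib `fnmatch` globbing against a list of ignore patterns. This is a deliberate simplification: real `.gitignore` semantics, which LLM Context handles, also include negation, directory anchoring, and nested ignore files.

```python
from fnmatch import fnmatch
from pathlib import Path


def select_files(root: str, ignore_patterns: list[str]) -> list[str]:
    """Pick files under root matching none of the ignore patterns.

    Simplified sketch: plain fnmatch globbing, not full .gitignore
    semantics (no negation, no directory anchoring).
    """
    selected = []
    for p in Path(root).rglob("*"):
        if not p.is_file():
            continue
        rel = p.relative_to(root).as_posix()
        # Exclude if any pattern matches the relative path or bare name.
        if not any(fnmatch(rel, pat) or fnmatch(p.name, pat)
                   for pat in ignore_patterns):
            selected.append(rel)
    return sorted(selected)
```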
See our [User Guide](docs/user-guide.md) for detailed documentation of these features.
## Similar Tools
Check out our [comprehensive list of alternatives](https://www.cyberchitta.cc/articles/lc-alternatives.html) - the sheer number of tools tackling this problem demonstrates its importance to the developer community.
## Acknowledgments
LLM Context evolves from a lineage of AI-assisted development tools:
- This project succeeds [LLM Code Highlighter](https://github.com/restlessronin/llm-code-highlighter), a TypeScript library I developed for IDE integration.
- The concept originated from my work on [RubberDuck](https://github.com/rubberduck-ai/rubberduck-vscode) and continued with later contributions to [Continue](https://github.com/continuedev/continuedev).
- LLM Code Highlighter was heavily inspired by [Aider Chat](https://github.com/paul-gauthier/aider). I worked with GPT-4 to translate several Aider Chat Python modules into TypeScript, maintaining functionality while restructuring the code.
- This project uses tree-sitter [tag query files](src/llm_context/highlighter/tag-qry/) from Aider Chat.
- LLM Context exemplifies the power of AI-assisted development, transitioning from Python to TypeScript and back to Python with the help of GPT-4 and Claude-3.5-Sonnet.
I am grateful for the open-source community's innovations and for the AI assistance, particularly Claude-3.5-Sonnet, that have shaped this project's evolution.
## License
This project is licensed under the Apache License, Version 2.0. See the [LICENSE](LICENSE) file for details.
Raw data
{
"_id": null,
"home_page": null,
"name": "llm-context",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": null,
"keywords": "ai, chat, clipboard, code, context, llm",
"author": null,
"author_email": "restlessronin <88921269+restlessronin@users.noreply.github.com>",
"download_url": "https://files.pythonhosted.org/packages/a9/fb/1faa808c9674202c42e74b7c3bf1bced5b6e80f0af476201a5f0ca53db53/llm_context-0.2.13.tar.gz",
"platform": null,
"description": "# LLM Context\n\n[](https://opensource.org/licenses/Apache-2.0)\n[](https://pypi.org/project/llm-context/)\n\nLLM Context is a tool that helps developers quickly inject relevant content from code/text projects into Large Language Model chat interfaces. It leverages `.gitignore` patterns for smart file selection and provides both a streamlined clipboard workflow using the command line and direct LLM integration through the Model Context Protocol (MCP).\n\n> **Note**: This project was developed in collaboration with Claude-3.5-Sonnet (and more recently Grok-3), using LLM Context itself to share code during development. All code in the repository is human-curated (by me \ud83d\ude07, @restlessronin).\n\n## Important: Configuration File Format Change\n\nConfiguration files were converted from TOML to YAML in v 0.2.9. Existing users **must manually convert** any customizations in `.llm-context/config.yaml` files to the new `.llm-context/config.yaml`.\n\n## Why LLM Context?\n\nFor an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: [LLM Context: Harnessing Vanilla AI Chats for Development](https://www.cyberchitta.cc/articles/llm-ctx-why.html)\n\n## Current Usage Patterns\n\n- **Direct LLM Integration**: Native integration with Claude Desktop via MCP protocol\n- **Chat Interface Support**: Works with any LLM chat interface via CLI/clipboard\n - Optimized for interfaces with persistent context like Claude Projects and Custom GPTs\n - Works equally well with standard chat interfaces\n- **Project Types**: Suitable for code repositories and collections of text/markdown/html documents\n- **Project Size**: Optimized for projects that fit within an LLM's context window. 
Large project support is in development\n\n## Installation\n\nInstall LLM Context using [uv](https://github.com/astral-sh/uv):\n\n```bash\nuv tool install llm-context\n```\n\nTo upgrade to the latest version:\n\n```bash\nuv tool upgrade llm-context\n```\n\n> **Warning**: LLM Context is under active development. Updates may overwrite configuration files prefixed with `lc-`. We recommend all configuration files be version controlled for this reason.\n\n## Quickstart\n\n### MCP with Claude Desktop\n\nAdd to 'claude_desktop_config.json':\n\n```jsonc\n{\n \"mcpServers\": {\n \"CyberChitta\": {\n \"command\": \"uvx\",\n \"args\": [\"--from\", \"llm-context\", \"lc-mcp\"]\n }\n }\n}\n```\n\nOnce configured, you can start working with your project in two simple ways:\n\n1. Say: \"I would like to work with my project\"\n Claude will ask you for the project root path.\n\n2. Or directly specify: \"I would like to work with my project /path/to/your/project\"\n Claude will automatically load the project context.\n\n### CLI Quick Start and Typical Workflow\n\n1. Navigate to your project's root directory\n2. Initialize repository: `lc-init` (only needed once)\n3. (Optional) Edit `.llm-context/config.yaml` to customize ignore patterns\n4. Select files: `lc-sel-files`\n5. (Optional) Review selected files in `.llm-context/curr_ctx.yaml`\n6. Generate context: `lc-context`\n7. Use with your preferred interface:\n\n- Project Knowledge (Claude Pro): Paste into knowledge section\n- GPT Knowledge (Custom GPTs): Paste into knowledge section\n- Regular chats: Use `lc-set-profile code-prompt` first to include instructions\n\n8. 
When the LLM requests additional files:\n - Copy the file list from the LLM\n - Run `lc-read-cliplist`\n - Paste the contents back to the LLM\n\n## Core Commands\n\n- `lc-init`: Initialize project configuration\n- `lc-set-profile <name>`: Switch profiles\n- `lc-sel-files`: Select files for inclusion\n- `lc-context`: Generate and copy context\n- `lc-prompt`: Generate project instructions for LLMs\n- `lc-read-cliplist`: Process LLM file requests\n- `lc-changed`: List files modified since last context generation\n- `lc-outlines`: Generate outlines for code files (requires [outline] extra at uv install time)\n\n## Features & Advanced Usage\n\nLLM Context provides advanced features for customizing how project content is captured and presented:\n\n- Smart file selection using `.gitignore` patterns\n- Multiple profiles for different use cases\n- Code outline generation for supported languages\n- Easy viewing of code structure with `lc-outlines` command\n- Customizable templates and prompts\n\nSee our [User Guide](docs/user-guide.md) for detailed documentation of these features.\n\n## Similar Tools\n\nCheck out our [comprehensive list of alternatives](https://www.cyberchitta.cc/articles/lc-alternatives.html) - the sheer number of tools tackling this problem demonstrates its importance to the developer community.\n\n## Acknowledgments\n\nLLM Context evolves from a lineage of AI-assisted development tools:\n\n- This project succeeds [LLM Code Highlighter](https://github.com/restlessronin/llm-code-highlighter), a TypeScript library I developed for IDE integration.\n- The concept originated from my work on [RubberDuck](https://github.com/rubberduck-ai/rubberduck-vscode) and continued with later contributions to [Continue](https://github.com/continuedev/continuedev).\n- LLM Code Highlighter was heavily inspired by [Aider Chat](https://github.com/paul-gauthier/aider). 
I worked with GPT-4 to translate several Aider Chat Python modules into TypeScript, maintaining functionality while restructuring the code.\n- This project uses tree-sitter [tag query files](src/llm_context/highlighter/tag-qry/) from Aider Chat.\n- LLM Context exemplifies the power of AI-assisted development, transitioning from Python to TypeScript and back to Python with the help of GPT-4 and Claude-3.5-Sonnet.\n\nI am grateful for the open-source community's innovations and the AI assistance that have shaped this project's evolution.\n\nI am grateful for the help of Claude-3.5-Sonnet in the development of this project.\n\n## License\n\nThis project is licensed under the Apache License, Version 2.0. See the [LICENSE](LICENSE) file for details.\n",
"bugtrack_url": null,
"license": null,
"summary": "Share code with LLMs via Model Context Protocol or clipboard. Profile-based customization enables easy switching between different tasks (like code review and documentation). Code outlining support is available as an experimental feature.",
"version": "0.2.13",
"project_urls": {
"Repository": "https://github.com/cyberchitta/llm-context.py",
"User Guide": "https://github.com/cyberchitta/llm-context.py/blob/main/docs/user-guide.md"
},
"split_keywords": [
"ai",
" chat",
" clipboard",
" code",
" context",
" llm"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "f8fbcc2d6ce9301afad788041b7e16091978341d7c64eda15d6265f12786762e",
"md5": "f9f1d6801b55a5cc6d85028dc403bad0",
"sha256": "99be4c1865af39be60fdfbac424a571c193e632b02564822003d05218208838e"
},
"downloads": -1,
"filename": "llm_context-0.2.13-py3-none-any.whl",
"has_sig": false,
"md5_digest": "f9f1d6801b55a5cc6d85028dc403bad0",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 44742,
"upload_time": "2025-02-28T05:42:04",
"upload_time_iso_8601": "2025-02-28T05:42:04.326826Z",
"url": "https://files.pythonhosted.org/packages/f8/fb/cc2d6ce9301afad788041b7e16091978341d7c64eda15d6265f12786762e/llm_context-0.2.13-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "a9fb1faa808c9674202c42e74b7c3bf1bced5b6e80f0af476201a5f0ca53db53",
"md5": "6a94d83b050dd98eaf94e6c6913c43b6",
"sha256": "d8e565d899112c9d1c7a122bfb0cd817d21738732f32ecbca948305c3725f14d"
},
"downloads": -1,
"filename": "llm_context-0.2.13.tar.gz",
"has_sig": false,
"md5_digest": "6a94d83b050dd98eaf94e6c6913c43b6",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 29894,
"upload_time": "2025-02-28T05:42:06",
"upload_time_iso_8601": "2025-02-28T05:42:06.105634Z",
"url": "https://files.pythonhosted.org/packages/a9/fb/1faa808c9674202c42e74b7c3bf1bced5b6e80f0af476201a5f0ca53db53/llm_context-0.2.13.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-02-28 05:42:06",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "cyberchitta",
"github_project": "llm-context.py",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "llm-context"
}