llm-context

- Name: llm-context
- Version: 0.2.0
- Summary: A command-line tool for copying code context to clipboard for use in LLM chats
- Upload time: 2024-12-06 18:01:52
- Requires Python: <3.13,>=3.10
- License: Apache-2.0
- Keywords: ai, chat, clipboard, code, context, llm
# LLM Context

[![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
[![PyPI version](https://img.shields.io/pypi/v/llm-context.svg)](https://pypi.org/project/llm-context/)

LLM Context is a command-line tool that helps developers quickly inject relevant content from code/text projects into Large Language Model chat interfaces. It leverages `.gitignore` patterns for smart file selection and provides both a streamlined clipboard workflow and direct LLM integration through the Model Context Protocol (MCP).

> **Note**: This project was developed in collaboration with Claude-3.5-Sonnet, using LLM Context itself to share code during development. All code in the repository is human-curated (by me 😇, @restlessronin).
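
Conceptually, `.gitignore`-driven selection means keeping every project file that no ignore pattern matches. The sketch below illustrates the idea using stdlib `fnmatch`; it is not LLM Context's actual implementation (real gitignore semantics are richer, with negation, anchoring, and directory rules), and the pattern list is hypothetical:

```python
from fnmatch import fnmatch

# Hypothetical ignore patterns, in the spirit of .gitignore entries.
IGNORE_PATTERNS = ["*.log", "node_modules/*", ".git/*", "*.lock"]

def is_selected(path: str, patterns=IGNORE_PATTERNS) -> bool:
    """Keep a file unless it matches any ignore pattern."""
    return not any(fnmatch(path, p) for p in patterns)

files = ["src/main.py", "debug.log", "node_modules/pkg/index.js", "README.md"]
print([f for f in files if is_selected(f)])  # selected files only
```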

## Why LLM Context?

For an in-depth exploration of the reasoning behind LLM Context and its approach to AI-assisted development, check out our article: [LLM Context: Harnessing Vanilla AI Chats for Development](https://www.cyberchitta.cc/articles/llm-ctx-why.html)

## Current Usage Patterns

- **Direct LLM Integration**: Native integration with Claude Desktop via MCP
- **Chat Interface Support**: Works with any LLM chat interface via CLI/clipboard
  - Optimized for interfaces with persistent context like Claude Projects and Custom GPTs
  - Works equally well with standard chat interfaces
- **Project Types**: Suitable for code repositories and collections of text/Markdown/HTML documents
- **Project Size**: Optimized for projects that fit within an LLM's context window; support for larger projects is in development

## Installation

Install LLM Context using [uv](https://github.com/astral-sh/uv):

```bash
uv tool install llm-context
```

> **Warning**: LLM Context is under active development. Updates may overwrite configuration files prefixed with `lc-`. We recommend backing up any customized files before updating.

## Quickstart

### MCP with Claude Desktop

Add the following to `claude_desktop_config.json`:

```jsonc
{
  "mcpServers": {
    "CyberChitta": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```

### CLI Quick Start and Typical Workflow

1. Navigate to your project's root directory
2. Initialize repository: `lc-init` (only needed once)
3. (Optional) Edit `.llm-context/config.toml` to customize ignore patterns
4. Select files: `lc-sel-files`
5. (Optional) Review selected files in `.llm-context/curr_ctx.toml`
6. Generate context: `lc-context`
7. Use with your preferred interface:

- Project Knowledge (Claude Pro): Paste into knowledge section
- GPT Knowledge (Custom GPTs): Paste into knowledge section
- Regular chats: Use `lc-profile code-prompt` first to include instructions

8. When the LLM requests additional files:
   - Copy the file list from the LLM
   - Run `lc-read-cliplist`
   - Paste the contents back to the LLM

## Core Commands

- `lc-init`: Initialize project configuration
- `lc-profile <name>`: Switch profiles
- `lc-sel-files`: Select files for inclusion
- `lc-context`: Generate and copy context
- `lc-read-cliplist`: Process LLM file requests

## Features & Advanced Usage

LLM Context provides advanced features for customizing how project content is captured and presented:

- Smart file selection using `.gitignore` patterns
- Multiple profiles for different use cases
- Code outline generation for supported languages
- Customizable templates and prompts
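
As an example of the kind of customization involved, an ignore-pattern override in `.llm-context/config.toml` might look roughly like this. The key names here are illustrative, not the exact schema; run `lc-init` and inspect the generated file for the authoritative layout:

```toml
# Illustrative sketch only -- consult the generated config.toml for real keys.
[gitignores]
full_files = [".git", "*.lock", "node_modules", "dist"]
```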

See our [User Guide](docs/user-guide.md) for detailed documentation of these features.

## Acknowledgments

LLM Context evolves from a lineage of AI-assisted development tools:

- This project succeeds [LLM Code Highlighter](https://github.com/restlessronin/llm-code-highlighter), a TypeScript library I developed for IDE integration.
- The concept originated from my work on [RubberDuck](https://github.com/rubberduck-ai/rubberduck-vscode) and continued with later contributions to [Continue](https://github.com/continuedev/continuedev).
- LLM Code Highlighter was heavily inspired by [Aider Chat](https://github.com/paul-gauthier/aider). I worked with GPT-4 to translate several Aider Chat Python modules into TypeScript, maintaining functionality while restructuring the code.
- This project uses tree-sitter [tag query files](src/llm_context/highlighter/tag-qry/) from Aider Chat.
- LLM Context exemplifies the power of AI-assisted development, transitioning from Python to TypeScript and back to Python with the help of GPT-4 and Claude-3.5-Sonnet.

I am grateful for the open-source community's innovations and for the AI assistance, particularly from Claude-3.5-Sonnet, that have shaped this project's evolution.

## License

This project is licensed under the Apache License, Version 2.0. See the [LICENSE](LICENSE) file for details.

            
