streamlm

Name: streamlm
Version: 0.1.10
Summary: A command-line interface for interacting with various Large Language Models with streaming markdown output
Upload time: 2025-07-11 20:19:29
Requires Python: >=3.10
License: MIT
Keywords: ai, anthropic, cli, deepseek, gemini, llm, markdown, ollama, openai, streaming, xai
            # StreamLM

[![Downloads](https://static.pepy.tech/badge/streamlm)](https://pepy.tech/project/streamlm)
[![PyPI version](https://badge.fury.io/py/streamlm.svg)](https://badge.fury.io/py/streamlm)
[![GitHub Release](https://img.shields.io/github/v/release/jeffmylife/streamlm)](https://github.com/jeffmylife/streamlm/releases)
[![Build Status](https://github.com/jeffmylife/streamlm/workflows/Test/badge.svg)](https://github.com/jeffmylife/streamlm/actions)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

A command-line interface for interacting with various Large Language Models, with beautifully formatted markdown responses.

## Installation

### uv (recommended)

```bash
uv tool install streamlm
```

### PyPI
```bash
pip install streamlm
```

### Homebrew (macOS/Linux)
```bash
brew install jeffmylife/streamlm/streamlm
```

## Usage

After installation, you can use the `lm` command:

```bash
lm explain quantum computing
lm -m gpt-4o "write a Python function"
lm -m claude-3-5-sonnet "analyze this data"
```

### Raw Markdown Output

StreamLM renders responses with built-in markdown formatting, but you can also emit raw markdown for piping to other tools:

```bash
# Output raw markdown without Rich formatting
lm --md "explain machine learning" > output.md

# Pipe to your favorite markdown formatter (like glow)
lm --md "write a Python tutorial" | glow

# Use with other markdown tools
lm --raw "create documentation" | pandoc -f markdown -t html
```

### Supported Models

StreamLM provides access to various Large Language Models including:

- **OpenAI**: GPT-4o, o1, o3-mini, GPT-4o-mini
- **Anthropic**: Claude-3-7-sonnet, Claude-3-5-sonnet, Claude-3-5-haiku
- **Google**: Gemini-2.5-flash, Gemini-2.5-pro, Gemini-2.0-flash-thinking
- **DeepSeek**: DeepSeek-R1, DeepSeek-V3
- **xAI**: Grok-4, Grok-3-beta, Grok-3-mini-beta
- **Local models**: Via Ollama (Llama3.3, Qwen2.5, DeepSeek-Coder, etc.)

### Options

- `--model` / `-m`: Choose the LLM model
- `--image` / `-i`: Include image files for vision models
- `--context` / `-c`: Add context from a file
- `--max-tokens` / `-t`: Set maximum response length
- `--temperature` / `-temp`: Control response creativity (0.0-1.0)
- `--think`: Show reasoning process (for reasoning models)
- `--debug` / `-d`: Enable debug mode
- `--raw` / `--md`: Output raw markdown without Rich formatting
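The options above can be combined in a single invocation. A sketch of how that might look, using only the flags documented here (the model names, file paths, and prompts are illustrative placeholders, not part of the project):

```bash
# Ask a vision-capable model about an image, with a local file as extra
# context and a cap on response length (paths and model are placeholders)
lm -m gpt-4o -i chart.png -c notes.txt -t 500 "summarize the chart using my notes"

# Show a reasoning model's thought process, with low-temperature output
lm -m deepseek-r1 --think --temperature 0.2 "prove that sqrt(2) is irrational"
```

Flags can appear in any order before the prompt; the prompt itself is passed as the positional argument.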

## Features

- 🎨 Beautiful markdown-formatted responses
- 🖼️ Image input support for compatible models
- 📁 Context file support
- 🧠 Reasoning model support (DeepSeek, OpenAI o1, etc.)
- 🔧 Extensive model support across providers
- ⚡ Fast and lightweight
- 🛠️ Easy configuration

## Links

- [PyPI Package](https://pypi.org/project/streamlm/)
- [Homebrew Tap](https://github.com/jeffmylife/homebrew-streamlm)
- [Issues](https://github.com/jeffmylife/streamlm/issues)

## License

MIT License - see [LICENSE](LICENSE) file for details.

## Development

```bash
# Make your changes
uv version --bump patch
git add .
git commit -m "feat: your changes"
git push

# Create GitHub release (this triggers everything automatically)
gh release create v0.1.4 --generate-notes
```

            
