llama-cloud


Name: llama-cloud
Version: 0.1.41
Home page: None
Summary: None
Upload time: 2025-09-05 22:45:13
Maintainer: None
Docs URL: None
Author: Logan Markewich
Requires Python: <4,>=3.8
License: MIT
Keywords: None
VCS: None
Bugtrack URL: None
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# LlamaIndex Python Client

This client is auto-generated using [Fern](https://buildwithfern.com/docs/intro).
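
Because the code is generated, changes are normally made in the upstream API definition and the client is regenerated rather than edited by hand. As a rough sketch (the Fern CLI install and the exact generation setup for this repository are assumptions), regeneration typically looks like:

```bash
# Hypothetical regeneration steps; this repo's actual Fern configuration may differ
npm install -g fern-api   # Fern CLI
fern generate             # regenerate the client from the API definition
```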

## Setup

Before testing or releasing, ensure you have the development environment set up:

```bash
# Install Poetry (if not already installed)
curl -sSL https://install.python-poetry.org | python3 -

# Install project dependencies
poetry install

# Verify setup
poetry --version
python --version
```
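
The package metadata pins `requires_python` to `<4,>=3.8`, so it can help to confirm that the interpreter Poetry picked up falls in that range. A quick check using standard Python and Poetry commands:

```bash
# Confirm the active interpreter satisfies the package's requires-python range (<4,>=3.8)
python -c 'import sys; assert (3, 8) <= sys.version_info[:2] < (4, 0), sys.version'

# Show which virtualenv Poetry is using
poetry env info --path
```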

## Quick Start

```bash
# Set up development environment (first time only)
make setup

# See all available commands
make help

# Test before releasing
make test

# Complete release workflow  
make release
```

## Commands

| Command | Description |
|---------|-------------|
| `make setup` | Set up development environment (first time only) |
| `make help` | Show all available commands |
| `make build` | Build the package |
| `make test` | Build, install locally, and run comprehensive tests |
| `make publish` | Test and publish to PyPI (with confirmation) |
| `make release` | Complete release workflow (clean → build → test → publish) |
| `make clean` | Clean build artifacts |
| `make setup-credentials` | Set up PyPI credentials (one-time setup) |

## Release Process

**First time setup:**
```bash
make setup  # Set up development environment
```

**For each release:**
1. **Update version** in `pyproject.toml`
2. **Run release workflow:**
   ```bash
   make release
   ```

That's it! The Makefile handles building, testing, and publishing with safety checks.
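
Before running the workflow, it is worth double-checking that the version bump in step 1 actually took effect. A quick check using standard Poetry and Git commands (not part of the Makefile itself):

```bash
# Print the version currently set in pyproject.toml
poetry version --short

# Show any uncommitted changes to pyproject.toml (the version bump should be the only one)
git diff --stat pyproject.toml
```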

## Manual Steps (if needed)

<details>
<summary>Expand for manual commands</summary>

**Build and test manually:**
```bash
poetry build
pip install --force-reinstall dist/llama_cloud-*-py3-none-any.whl
python test_local_build.py
```
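
If `test_local_build.py` is not present in your checkout, a minimal smoke test is to import the installed package directly (the `llama_cloud` import name is inferred from the wheel filename; anything beyond the bare import is an assumption):

```bash
# Minimal smoke test: the installed wheel should at least be importable
python -c "import llama_cloud; print('llama_cloud imported OK')"

# Confirm the installed version matches pyproject.toml
pip show llama-cloud
```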

**Setup credentials manually:**
```bash
poetry config pypi-token.pypi <your-token>
```
Get the PyPI token from [1Password](https://start.1password.com/open/i?a=32SA66TZ3JCRXOCMASLSDCT5TI&v=lhv7hvb5o46cwo257c3hviqkle&i=yvslwei7jtf6tgqamzcdantqi4&h=llamaindex.1password.com).
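
With the token configured, the manual equivalent of the publish step is plain Poetry (a sketch; `make publish` may add extra confirmation and safety checks on top of this):

```bash
# Upload the already-built artifacts in dist/ using the configured pypi-token
poetry publish

# Or build and publish in one step
poetry publish --build
```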

</details>

            

Raw data

{
    "_id": null,
    "home_page": null,
    "name": "llama-cloud",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4,>=3.8",
    "maintainer_email": null,
    "keywords": null,
    "author": "Logan Markewich",
    "author_email": "logan@runllama.ai",
    "download_url": "https://files.pythonhosted.org/packages/62/6c/b2e84eebed376aea34c446cab745da5fc4e9dc53309180672299083219d5/llama_cloud-0.1.41.tar.gz",
    "platform": null,
    "description": "# LlamaIndex Python Client\n\nThis client is auto-generated using [Fern](https://buildwithfern.com/docs/intro)\n\n## Setup\n\nBefore testing or releasing, ensure you have the development environment set up:\n\n```bash\n# Install Poetry (if not already installed)\ncurl -sSL https://install.python-poetry.org | python3 -\n\n# Install project dependencies\npoetry install\n\n# Verify setup\npoetry --version\npython --version\n```\n\n## Quick Start\n\n```bash\n# Set up development environment (first time only)\nmake setup\n\n# See all available commands\nmake help\n\n# Test before releasing\nmake test\n\n# Complete release workflow  \nmake release\n```\n\n## Commands\n\n| Command | Description |\n|---------|-------------|\n| `make setup` | Set up development environment (first time only) |\n| `make help` | Show all available commands |\n| `make build` | Build the package |\n| `make test` | Build, install locally, and run comprehensive tests |\n| `make publish` | Test and publish to PyPI (with confirmation) |\n| `make release` | Complete release workflow (clean \u2192 build \u2192 test \u2192 publish) |\n| `make clean` | Clean build artifacts |\n| `make setup-credentials` | Set up PyPI credentials (one-time setup) |\n\n## Release Process\n\n**First time setup:**\n```bash\nmake setup  # Set up development environment\n```\n\n**For each release:**\n1. **Update version** in `pyproject.toml`\n2. **Run release workflow:**\n   ```bash\n   make release\n   ```\n\nThat's it! The Makefile handles building, testing, and publishing with safety checks.\n\n## Manual Steps (if needed)\n\n<details>\n<summary>Expand for manual commands</summary>\n\n**Build and test manually:**\n```bash\npoetry build\npip install --force-reinstall dist/llama_cloud-*-py3-none-any.whl\npython test_local_build.py\n```\n\n**Setup credentials manually:**\n```bash\npoetry config pypi-token.pypi <your-token>\n```\nGet token from [1Password](https://start.1password.com/open/i?a=32SA66TZ3JCRXOCMASLSDCT5TI&v=lhv7hvb5o46cwo257c3hviqkle&i=yvslwei7jtf6tgqamzcdantqi4&h=llamaindex.1password.com)\n\n</details>\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": null,
    "version": "0.1.41",
    "project_urls": null,
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "1e4df0af76b389310840ce3483a92560a152025b0eefe4eee0c81102bf3317e6",
                "md5": "24304fbe606d3f51713bf3bda8d2509d",
                "sha256": "c847f288f0d3f4b23f47345088006deae5f2cf3f223ac1819d4c1531e9aaa13e"
            },
            "downloads": -1,
            "filename": "llama_cloud-0.1.41-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "24304fbe606d3f51713bf3bda8d2509d",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4,>=3.8",
            "size": 307646,
            "upload_time": "2025-09-05T22:45:11",
            "upload_time_iso_8601": "2025-09-05T22:45:11.597702Z",
            "url": "https://files.pythonhosted.org/packages/1e/4d/f0af76b389310840ce3483a92560a152025b0eefe4eee0c81102bf3317e6/llama_cloud-0.1.41-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "626cb2e84eebed376aea34c446cab745da5fc4e9dc53309180672299083219d5",
                "md5": "ca6b6a3117e318a0b83b2f00a0556a77",
                "sha256": "dcb741b779e3e740cd64928cfffc8ef70ed0e9bae9ef26acbe1d7e32aa737bdc"
            },
            "downloads": -1,
            "filename": "llama_cloud-0.1.41.tar.gz",
            "has_sig": false,
            "md5_digest": "ca6b6a3117e318a0b83b2f00a0556a77",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4,>=3.8",
            "size": 109854,
            "upload_time": "2025-09-05T22:45:13",
            "upload_time_iso_8601": "2025-09-05T22:45:13.069616Z",
            "url": "https://files.pythonhosted.org/packages/62/6c/b2e84eebed376aea34c446cab745da5fc4e9dc53309180672299083219d5/llama_cloud-0.1.41.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-09-05 22:45:13",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "llama-cloud"
}
        