keepa-crawler

Name: keepa-crawler
Version: 1.0.0
Home page: https://github.com/stanvanrooy/keepa_crawler
Summary: A client to crawl Keepa's historical Amazon product data
Upload time: 2025-01-26 12:13:04
Maintainer: None
Docs URL: None
Author: Stan van Rooy
Requires Python: >=3.7
License: None
Keywords: None
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# keepa_crawler

A Python client for interacting with Keepa's WebSocket API to retrieve historical Amazon product data.

## Installation

Install from PyPI:

```bash
pip install keepa_crawler
```
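
To confirm the install worked, the module should import cleanly and `importlib.metadata` should report the version pip recorded (1.0.0 at the time of this listing):

```python
import importlib.metadata

import keepa_crawler  # fails here if the install is broken

# The distribution is published on PyPI as "keepa-crawler".
print(importlib.metadata.version("keepa-crawler"))
```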

## Usage

```python
from keepa_crawler import KeepaClient

# Initialize the client
client = KeepaClient()

# Example 1: Retrieve historical prices for a specific ASIN
try:
    data = client.get_historical_prices(asin="B08N5WRWNW")
    print("Historical Prices:", data)
except Exception as e:
    print(f"Error retrieving data: {e}")

# Example 2: Handling multiple ASINs
asins = ["B08N5WRWNW", "B07XJ8C8F5", "B09FGT1JQC"]
for asin in asins:
    try:
        data = client.get_historical_prices(asin=asin)
        print(f"Data for {asin}: {data}")
    except Exception as e:
        print(f"Error retrieving data for {asin}: {e}")

# Example 3: Handling a timeout error
try:
    data = client.get_historical_prices(asin="B08N5WRWNW", timeout=5)
    print("Historical Prices:", data)
except Exception as e:
    print(f"Timeout or other error occurred: {e}")

# Clean up the client
client.close()
```
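
Since the client holds an open connection, it is worth making sure `close()` runs even when a lookup fails. The sketch below is not part of the library: `fetch_with_retries` is a hypothetical helper built only from the `get_historical_prices`, `timeout`, and `close()` calls shown above, and the retry count and delay are arbitrary choices.

```python
import time

from keepa_crawler import KeepaClient


def fetch_with_retries(client, asin, attempts=3, delay=2.0, timeout=5):
    """Hypothetical helper: retry a single ASIN lookup a few times before giving up."""
    for attempt in range(1, attempts + 1):
        try:
            return client.get_historical_prices(asin=asin, timeout=timeout)
        except Exception as e:
            print(f"Attempt {attempt}/{attempts} for {asin} failed: {e}")
            time.sleep(delay)
    return None


client = KeepaClient()
try:
    for asin in ["B08N5WRWNW", "B07XJ8C8F5"]:
        data = fetch_with_retries(client, asin)
        if data is not None:
            print(f"Data for {asin}: {data}")
finally:
    # Release the connection even if a lookup raised.
    client.close()
```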

## Requirements

* Python 3.8 or newer
* Dependencies are installed automatically via `pip`

## License

This project is licensed under the MIT License. See the `LICENSE` file for details.


            
