scrapeready

Name: scrapeready
Version: 0.1.0
Summary: A Python client for the Scrapeready.com v1 API
Upload time: 2025-01-30 17:50:19
Home page: None
Author: None
Maintainer: None
Docs URL: None
Requires Python: >=3.7
License: None
Keywords: scraping, serp, ai, scrapeready
Requirements: none recorded
# Scrapeready Client

A Python client for making requests to [scrapeready.com](https://api.scrapeready.com/v1/) endpoints.

## Features

- **website_to_text(url: str)**  
  Scrapes text from a website.

- **serp(q: str)**  
  Retrieves SERP (Search Engine Results Page) information.

- **ai_search(q: Optional[str], prompt: Optional[str], max_websites: int = 5)**  
  Generates or uses a query to scrape top search results.

## Installation

```bash
pip install scrapeready
```
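## Usage

The README documents the three calls but not the client's class name, endpoint paths, or authentication scheme, so the sketch below is a hedged illustration built on `requests`; `ScrapereadyClient`, the endpoint names, and the `Bearer` header are assumptions, not the package's confirmed API.

```python
import requests

API_BASE = "https://api.scrapeready.com/v1"  # base URL from the README link


class ScrapereadyClient:
    """Hypothetical wrapper around the three documented endpoints."""

    def __init__(self, api_key: str):
        self.api_key = api_key

    def _get(self, endpoint: str, params: dict) -> dict:
        # Shared GET helper; auth header shape is an assumption.
        resp = requests.get(
            f"{API_BASE}/{endpoint}",
            params={k: v for k, v in params.items() if v is not None},
            headers={"Authorization": f"Bearer {self.api_key}"},
        )
        resp.raise_for_status()
        return resp.json()

    def website_to_text(self, url: str) -> dict:
        """Scrape text from a website."""
        return self._get("website_to_text", {"url": url})

    def serp(self, q: str) -> dict:
        """Retrieve SERP information for a query."""
        return self._get("serp", {"q": q})

    def ai_search(self, q=None, prompt=None, max_websites: int = 5) -> dict:
        """Generate or use a query to scrape top search results."""
        return self._get(
            "ai_search",
            {"q": q, "prompt": prompt, "max_websites": max_websites},
        )
```

If the installed package exposes different names, prefer those; this sketch only mirrors the signatures listed under Features.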

Raw data

{
    "_id": null,
    "home_page": null,
    "name": "scrapeready",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.7",
    "maintainer_email": null,
    "keywords": "scraping, serp, ai, scrapeready",
    "author": null,
    "author_email": "Your Name <your.email@example.com>",
    "download_url": "https://files.pythonhosted.org/packages/f8/60/e34a3eb793053ce88fcc79d0cfa2854758b44baa4356a880a0e9acc9ab36/scrapeready-0.1.0.tar.gz",
    "platform": null,
    "description": "# Scrapeready Client\n\nA Python client for making requests to [scrapeready.com](https://api.scrapeready.com/v1/) endpoints.\n\n## Features\n\n- **website_to_text(url: str)**  \n  Scrapes text from a website.\n\n- **serp(q: str)**  \n  Retrieves SERP (Search Engine Results Page) information.\n\n- **ai_search(q: Optional[str], prompt: Optional[str], max_websites: int = 5)**  \n  Generates or uses a query to scrape top search results.\n\n## Installation\n\n```bash\npip install scrapeready\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "A Python client for the Scrapeready.com v1 API",
    "version": "0.1.0",
    "project_urls": {
        "Homepage": "https://github.com/YOUR_GITHUB/scrapeready",
        "Source": "https://github.com/YOUR_GITHUB/scrapeready",
        "Tracker": "https://github.com/YOUR_GITHUB/scrapeready/issues"
    },
    "split_keywords": [
        "scraping",
        " serp",
        " ai",
        " scrapeready"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "114be4c094fac17268d6d757d468ca6df12f73f315671780b07386ca06f99f30",
                "md5": "6949039eaec236597008ad3f7fc61946",
                "sha256": "a53a37f5b75a74e228b2aa095aa187b22f5d396a60c763743a6c1bb061b414fe"
            },
            "downloads": -1,
            "filename": "scrapeready-0.1.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "6949039eaec236597008ad3f7fc61946",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 3271,
            "upload_time": "2025-01-30T17:50:17",
            "upload_time_iso_8601": "2025-01-30T17:50:17.008755Z",
            "url": "https://files.pythonhosted.org/packages/11/4b/e4c094fac17268d6d757d468ca6df12f73f315671780b07386ca06f99f30/scrapeready-0.1.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "f860e34a3eb793053ce88fcc79d0cfa2854758b44baa4356a880a0e9acc9ab36",
                "md5": "1edff756c7a5256b7458cec033fedaf3",
                "sha256": "91f915db70cc9b047430d6026048d45ccef82f9451de3aed66b738ee38492ba8"
            },
            "downloads": -1,
            "filename": "scrapeready-0.1.0.tar.gz",
            "has_sig": false,
            "md5_digest": "1edff756c7a5256b7458cec033fedaf3",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.7",
            "size": 3051,
            "upload_time": "2025-01-30T17:50:19",
            "upload_time_iso_8601": "2025-01-30T17:50:19.692943Z",
            "url": "https://files.pythonhosted.org/packages/f8/60/e34a3eb793053ce88fcc79d0cfa2854758b44baa4356a880a0e9acc9ab36/scrapeready-0.1.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-01-30 17:50:19",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "YOUR_GITHUB",
    "github_project": "scrapeready",
    "github_not_found": true,
    "lcname": "scrapeready"
}
        