# ScrapingApi Python Client
This is a placeholder for a future Python client for the ScrapingApi [web scraping API](https://scrapingapi.net).
Get in touch with us at [contact@scrapingapi.net](mailto:contact@scrapingapi.net) if you need this client.
## Installation
Sign up at [scrapingapi.net](https://scrapingapi.net) to get your API key.
```bash
pip install scrapingapi
```
## Example
```python
# Python client not available yet.
```
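
Until the official client is published, the HTTP API can be called directly. Below is a minimal sketch using `requests`; the endpoint path and the `api_key`, `url`, and `render` parameter names are assumptions made for illustration only — check the [ScrapingApi documentation](https://scrapingapi.net/documentation) for the actual interface.

```python
# Not the official client: endpoint and parameter names below are assumptions.
import requests

API_KEY = "your-api-key"  # obtained after signing up at scrapingapi.net

response = requests.get(
    "https://scrapingapi.net/api",      # assumed endpoint
    params={
        "api_key": API_KEY,             # assumed auth parameter
        "url": "https://example.com",   # page to scrape
        "render": "true",               # assumed option for headless Chrome rendering
    },
    timeout=30,
)
response.raise_for_status()
print(response.text)  # scraped page content
```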
## Parameters
Check the [ScrapingApi documentation](https://scrapingapi.net/documentation) for more details.
## Raw data

```json
{
  "_id": null,
  "home_page": "https://scrapingapi.net/",
  "name": "scrapingapi",
  "maintainer": "",
  "docs_url": null,
  "requires_python": "",
  "maintainer_email": "",
  "keywords": "scraping,proxy,phantomjs,scraping,website,headless,chrome,render,page,webkit",
  "author": "Timoth\u00e9e Jeannin",
  "author_email": "tjeannin@scrapingapi.net",
  "download_url": "https://files.pythonhosted.org/packages/86/d4/190687b11335b87949617a7b72ef27f22bc6a4ad19c71715fdda41fd1953/scrapingapi-0.1.4.tar.gz",
  "platform": null,
  "description": "# ScrapingApi Python Client\n\nThis is a placeholder for a future python ScrapingApi [web scraping API](https://scrapingapi.net) client. \nGet in touch with us at [contact@scrapingapi.net](mailto:contact@scrapingapi.net) if you need this client. \n\n## Installation\n\nSignup at [scrapingapi.net](https://scrapingapi.net) to get your API key.\n\n```\npip install scrapingapi\n```\n\n## Example\n\n```python\n# Python client not available yet.\n```\n\n## Parameters\n\nCheck the [ScrapingApi documentation](https://scrapingapi.net/documentation) for more details.\n\n",
  "bugtrack_url": null,
  "license": "MIT",
  "summary": "The official python client of ScrapingApi, website scraping API.",
  "version": "0.1.4",
  "project_urls": {
    "Code": "https://github.com/ScrapingApi/python-client",
    "Documentation": "https://scrapingapi.net/documentation",
    "Homepage": "https://scrapingapi.net/",
    "Issue tracker": "https://github.com/ScrapingApi/python-client/issues"
  },
  "split_keywords": [
    "scraping",
    "proxy",
    "phantomjs",
    "scraping",
    "website",
    "headless",
    "chrome",
    "render",
    "page",
    "webkit"
  ],
  "urls": [
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "55231520bae52b7d8203110de2d52d3a70f4e4b192b3fb51bb96978e29a72f12",
        "md5": "0b2e40d8527e3a9cd647380757e9fa97",
        "sha256": "6fafd80e35f6598ae1afa903ab284a250cf697c6f7d617cbd324a88c9500fc41"
      },
      "downloads": -1,
      "filename": "scrapingapi-0.1.4-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "0b2e40d8527e3a9cd647380757e9fa97",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": null,
      "size": 2652,
      "upload_time": "2023-06-27T07:54:00",
      "upload_time_iso_8601": "2023-06-27T07:54:00.163470Z",
      "url": "https://files.pythonhosted.org/packages/55/23/1520bae52b7d8203110de2d52d3a70f4e4b192b3fb51bb96978e29a72f12/scrapingapi-0.1.4-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "86d4190687b11335b87949617a7b72ef27f22bc6a4ad19c71715fdda41fd1953",
        "md5": "e252f7048331e06fabdd80ca923ad6d1",
        "sha256": "5831da9b48525bafd7da233e12a3db51eed443b69b1b93717bea89654b814943"
      },
      "downloads": -1,
      "filename": "scrapingapi-0.1.4.tar.gz",
      "has_sig": false,
      "md5_digest": "e252f7048331e06fabdd80ca923ad6d1",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": null,
      "size": 2736,
      "upload_time": "2023-06-27T07:54:02",
      "upload_time_iso_8601": "2023-06-27T07:54:02.093652Z",
      "url": "https://files.pythonhosted.org/packages/86/d4/190687b11335b87949617a7b72ef27f22bc6a4ad19c71715fdda41fd1953/scrapingapi-0.1.4.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2023-06-27 07:54:02",
  "github": true,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "github_user": "ScrapingApi",
  "github_project": "python-client",
  "github_not_found": true,
  "lcname": "scrapingapi"
}
```
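
The `digests` listed above can be checked against a downloaded release file. A minimal sketch using only the standard library, verifying the sdist against its published sha256:

```python
# Verify the scrapingapi-0.1.4.tar.gz sdist against the sha256 from the metadata above.
import hashlib
import urllib.request

URL = (
    "https://files.pythonhosted.org/packages/86/d4/"
    "190687b11335b87949617a7b72ef27f22bc6a4ad19c71715fdda41fd1953/"
    "scrapingapi-0.1.4.tar.gz"
)
EXPECTED_SHA256 = "5831da9b48525bafd7da233e12a3db51eed443b69b1b93717bea89654b814943"

with urllib.request.urlopen(URL) as resp:
    data = resp.read()

digest = hashlib.sha256(data).hexdigest()
print("OK" if digest == EXPECTED_SHA256 else f"MISMATCH: {digest}")
```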