Amazon Products Scraper
=======================
.. image:: https://badge.fury.io/py/amazon-scrape.svg
:target: https://badge.fury.io/py/amazon-scrape
:alt: amazon-scrape Python Package Version
Scrape Amazon product data such as Product Name, Product Images, Number of Reviews, Price, Product URL, and ASIN.
Requirements
------------
Python 2.7 or later (Python 3.0 through 3.3 are not supported).
Setup
-----
Install the package with pip:
.. code-block:: bash
$ pip install amazon-scrape
Or:
.. code-block:: bash
$ easy_install amazon-scrape
Scraper Help
------------
Run ``amazon_scraper --help`` in the terminal to see the available options:
.. code-block:: text
usage: amazon_scraper [-h] [--locale LOCALE] [--keywords KEYWORDS] [--url URL] [--proxy_api_key PROXY_API_KEY] [--pages PAGES] [-r]
optional arguments:
-h, --help show this help message and exit
--locale LOCALE Amazon locale (e.g., "com", "co.uk", "de", etc.)
--keywords KEYWORDS Search keywords
--url URL Amazon URL
--proxy_api_key Scraper API Key
--pages PAGES Number of pages to scrape
-r, --review Scrape reviews
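For orientation, the options above map cleanly onto Python's standard ``argparse`` module. The snippet below is a minimal, hypothetical sketch of a parser with the documented flags and the stated defaults ("co.uk" locale, 20 pages); it illustrates the interface only and is not the package's actual source.
.. code-block:: python
# Hypothetical sketch of a parser matching the documented CLI options.
# Defaults ("co.uk" locale, 20 pages) follow the usage notes below; the
# real package may implement this differently.
import argparse

parser = argparse.ArgumentParser(prog="amazon_scraper")
parser.add_argument("--locale", default="co.uk",
                    help='Amazon locale (e.g., "com", "co.uk", "de", etc.)')
parser.add_argument("--keywords", help="Search keywords")
parser.add_argument("--url", help="Amazon URL")
parser.add_argument("--proxy_api_key", help="Scraper API Key")
parser.add_argument("--pages", type=int, default=20,
                    help="Number of pages to scrape")
parser.add_argument("-r", "--review", action="store_true",
                    help="Scrape reviews")

args = parser.parse_args()
print(args)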
Usage Example
-------------
.. code-block:: bash
# Specify locale, keywords, API key, and number of pages to scrape:
amazon_scraper --locale com --keywords "laptop" --proxy_api_key "your_api_key" --pages 10
# Specify only keywords and API key (defaults to the "co.uk" locale and 20 pages):
amazon_scraper --keywords "iphone" --proxy_api_key "your_api_key"
# Specify a direct Amazon URL and API key (defaults to the "co.uk" locale and 20 pages):
amazon_scraper --url "https://www.amazon.de/s?k=iphone&crid=1OHYY6U6OGCK5&sprefix=ipho%2Caps%2C335&ref=nb_sb_noss_2" --proxy_api_key "your_api_key"
# Specify locale and Amazon URL (defaults to 20 pages):
amazon_scraper --locale de --url "https://www.amazon.de/s?k=iphone&crid=1OHYY6U6OGCK5&sprefix=ipho%2Caps%2C335&ref=nb_sb_noss_2" --proxy_api_key "your_api_key"
# Pass --review to scrape product reviews:
amazon_scraper --keywords "watches" --proxy_api_key "your_api_key" --review
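If you prefer to drive the command from Python rather than the shell, a thin wrapper around the CLI needs nothing beyond the standard library. This is a hedged sketch, not an API exposed by the package; the environment variable name ``SCRAPER_API_KEY`` is an arbitrary choice made here so the key stays out of shell history.
.. code-block:: python
# Sketch: invoke the amazon_scraper CLI from Python via subprocess.
# SCRAPER_API_KEY is an arbitrary environment variable name used in this
# example; the package itself does not require it.
import os
import subprocess

api_key = os.environ["SCRAPER_API_KEY"]

cmd = [
    "amazon_scraper",
    "--locale", "com",
    "--keywords", "laptop",
    "--proxy_api_key", api_key,
    "--pages", "5",
]

result = subprocess.run(cmd, check=True)
print("amazon_scraper exited with", result.returncode)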
Create Scraper API Account
--------------------------
Sign up for a Scraper API `user account`_.
.. _user account: https://www.scraperapi.com/?fp_ref=finbarrs11
License
-------
This project is licensed under the `MIT License`_.
.. _MIT License: https://github.com/0xnu/amazonproducts/blob/main/LICENSE
Copyright
---------
Copyright |copy| 2023 `Finbarrs Oketunji`_. All Rights Reserved.
.. |copy| unicode:: 0xA9 .. copyright sign
.. _Finbarrs Oketunji: https://finbarrs.eu