# Pinterest Crawler
<img src="https://raw.githubusercontent.com/SajjadAemmi/Pinterest-Crawler/main/Pinterest-Logo.png" width="400px">
Downloads HD images from Pinterest for your favorite keywords. A useful tool for creating datasets for machine learning projects.
## Install
Install the package with pip in a Python>=3.8 environment:
```bash
pip install pinterest-crawler
```
## Usage
### CLI
Pinterest Crawler can be used directly from the command line interface (CLI):
```bash
pinterest-crawler --keywords lion tiger bear
```
You can also write your favorite keywords in a file, for example `my_keywords.txt`, and pass the file path to the `--keywords` argument:
```bash
pinterest-crawler --keywords my_keywords.txt
```
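
The keyword file format is not documented in this README; a plausible `my_keywords.txt`, assuming one keyword per line, might look like this:

```text
lion
tiger
bear
```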
### Python
Pinterest Crawler may also be used directly in a Python environment, and accepts the same arguments as in the CLI example above:
```python
from pinterest_crawler import PinterestCrawler
# Instantiate the crawler, then download images for each keyword
pinterest_crawler = PinterestCrawler()
pinterest_crawler(keywords=['lion', 'programmer'])
```
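
If you want to combine the two approaches and drive the crawler from Python with a keyword file, a minimal sketch could look like the following. The file name `my_keywords.txt` and the one-keyword-per-line format are assumptions for illustration; only `PinterestCrawler` and its `keywords` argument come from the documented example above.

```python
from pathlib import Path

from pinterest_crawler import PinterestCrawler

# Assumption: my_keywords.txt holds one keyword per line (format not documented here).
keywords = [
    line.strip()
    for line in Path("my_keywords.txt").read_text().splitlines()
    if line.strip()
]

pinterest_crawler = PinterestCrawler()
pinterest_crawler(keywords=keywords)  # same call style as the example above
```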
<!-- Due to some limitations of Pinterest, you can download 100 images per keyword. If you want to download more images, you can run following command for infinite execution:
```
python loop.py
``` -->
## TODO
- [ ] download images in a loop
- [ ] download images in a specific size
- [ ] download images in a specific format