===========
scrapy-poet
===========
.. image:: https://img.shields.io/pypi/v/scrapy-poet.svg
   :target: https://pypi.python.org/pypi/scrapy-poet
   :alt: PyPI Version

.. image:: https://img.shields.io/pypi/pyversions/scrapy-poet.svg
   :target: https://pypi.python.org/pypi/scrapy-poet
   :alt: Supported Python Versions

.. image:: https://github.com/scrapinghub/scrapy-poet/workflows/tox/badge.svg
   :target: https://github.com/scrapinghub/scrapy-poet/actions
   :alt: Build Status

.. image:: https://codecov.io/github/scrapinghub/scrapy-poet/coverage.svg?branch=master
   :target: https://codecov.io/gh/scrapinghub/scrapy-poet
   :alt: Coverage report

.. image:: https://readthedocs.org/projects/scrapy-poet/badge/?version=stable
   :target: https://scrapy-poet.readthedocs.io/en/stable/?badge=stable
   :alt: Documentation Status

``scrapy-poet`` is the `web-poet`_ Page Object pattern implementation for Scrapy.
It lets you write spiders in which the extraction logic is separated from the
crawling logic, making it possible to build a single spider that supports many
sites with different layouts.

Read the `documentation <https://scrapy-poet.readthedocs.io>`_ for more information.

License is BSD 3-clause.

* Documentation: https://scrapy-poet.readthedocs.io
* Source code: https://github.com/scrapinghub/scrapy-poet
* Issue tracker: https://github.com/scrapinghub/scrapy-poet/issues

.. _`web-poet`: https://github.com/scrapinghub/web-poet


Quick Start
***********

Installation
============

.. code-block::

    pip install scrapy-poet

Requires **Python 3.8+** and **Scrapy >= 2.6.0**.

Usage in a Scrapy Project
=========================

Add the following inside Scrapy's ``settings.py`` file:

.. code-block:: python

    DOWNLOADER_MIDDLEWARES = {
        "scrapy_poet.InjectionMiddleware": 543,
        "scrapy.downloadermiddlewares.stats.DownloaderStats": None,
        "scrapy_poet.DownloaderStatsMiddleware": 850,
    }
    SPIDER_MIDDLEWARES = {
        "scrapy_poet.RetryMiddleware": 275,
    }
    REQUEST_FINGERPRINTER_CLASS = "scrapy_poet.ScrapyPoetRequestFingerprinter"
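
With the middlewares enabled, ``scrapy-poet`` can inject page objects into
spider callbacks based on their type annotations. Below is a minimal sketch,
assuming ``web-poet``'s ``WebPage`` base class; the spider name, start URL, and
CSS selector are illustrative:

.. code-block:: python

    import scrapy
    from web_poet import WebPage


    class BookPage(WebPage):
        """Page object: extraction logic, kept separate from crawling code."""

        def to_item(self) -> dict:
            return {
                "url": self.url,
                "title": self.css("h1::text").get(),
            }


    class BooksSpider(scrapy.Spider):
        name = "books"
        start_urls = ["http://books.toscrape.com/"]  # illustrative URL

        def parse(self, response, page: BookPage):
            # scrapy-poet builds a BookPage for this response and injects it
            # here because of the ``page: BookPage`` annotation.
            yield page.to_item()

To support many sites with a single spider, page objects can be registered per
domain with ``web-poet``'s ``handle_urls`` decorator and the
``SCRAPY_POET_RULES`` setting; see the documentation for details.
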
Developing
==========

Set up your local Python environment via:

1. ``pip install -r requirements-dev.txt``
2. ``pre-commit install``

Now every time you perform a ``git commit``, these tools will run against the
staged files:

* ``black``
* ``isort``
* ``flake8``

You can also directly invoke ``pre-commit run --all-files`` or ``tox -e linters``
to run them without performing a commit.