===========
scrapy-poet
===========

.. image:: https://img.shields.io/pypi/v/scrapy-poet.svg
   :target: https://pypi.python.org/pypi/scrapy-poet
   :alt: PyPI Version

.. image:: https://img.shields.io/pypi/pyversions/scrapy-poet.svg
   :target: https://pypi.python.org/pypi/scrapy-poet
   :alt: Supported Python Versions

.. image:: https://github.com/scrapinghub/scrapy-poet/workflows/tox/badge.svg
   :target: https://github.com/scrapinghub/scrapy-poet/actions
   :alt: Build Status

.. image:: https://codecov.io/github/scrapinghub/scrapy-poet/coverage.svg?branch=master
   :target: https://codecov.io/gh/scrapinghub/scrapy-poet
   :alt: Coverage report

.. image:: https://readthedocs.org/projects/scrapy-poet/badge/?version=stable
   :target: https://scrapy-poet.readthedocs.io/en/stable/?badge=stable
   :alt: Documentation Status

``scrapy-poet`` is the `web-poet`_ Page Object pattern implementation for Scrapy.
``scrapy-poet`` lets you write spiders whose extraction logic is separated from the
crawling logic, making it possible to build a single spider that supports many
sites with different layouts.
Read the `documentation <https://scrapy-poet.readthedocs.io>`_ for more information.
License is BSD 3-clause.
* Documentation: https://scrapy-poet.readthedocs.io
* Source code: https://github.com/scrapinghub/scrapy-poet
* Issue tracker: https://github.com/scrapinghub/scrapy-poet/issues
.. _`web-poet`: https://github.com/scrapinghub/web-poet
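
The separation described above can be illustrated with a tiny, library-free
sketch of the Page Object idea. The names ``BookPage`` and ``to_item`` follow
the `web-poet`_ convention, but this sketch uses only the standard library and
regexes instead of the real API; it is an illustration of the pattern, not of
``scrapy-poet`` itself:

.. code-block:: python

    import re
    from dataclasses import dataclass

    @dataclass
    class BookPage:
        """Page object: all extraction logic for a book page lives here."""

        html: str

        def to_item(self) -> dict:
            title = re.search(r"<h1>(.*?)</h1>", self.html).group(1)
            price = re.search(r'class="price">(.*?)<', self.html).group(1)
            return {"title": title, "price": price}

    # The crawling side only decides *what* to fetch and which page object
    # to delegate parsing to; swapping BookPage for another page object class
    # is enough to support a site with a different layout.
    def parse(html: str, page_cls=BookPage) -> dict:
        return page_cls(html).to_item()

    sample = '<h1>Dune</h1><span class="price">$9.99</span>'
    print(parse(sample))  # {'title': 'Dune', 'price': '$9.99'}

In a real project, the page object subclasses ``web_poet.WebPage`` and is
injected into the spider callback by ``scrapy-poet``.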
Quick Start
***********
Installation
============

.. code-block:: bash

    pip install scrapy-poet
Requires **Python 3.8+** and **Scrapy >= 2.6.0**.
Usage in a Scrapy Project
=========================
Add the following inside Scrapy's ``settings.py`` file:

.. code-block:: python

    DOWNLOADER_MIDDLEWARES = {
        "scrapy_poet.InjectionMiddleware": 543,
        "scrapy.downloadermiddlewares.stats.DownloaderStats": None,
        "scrapy_poet.DownloaderStatsMiddleware": 850,
    }
    SPIDER_MIDDLEWARES = {
        "scrapy_poet.RetryMiddleware": 275,
    }
    REQUEST_FINGERPRINTER_CLASS = "scrapy_poet.ScrapyPoetRequestFingerprinter"
Developing
==========

Set up your local Python environment via:

1. ``pip install -r requirements-dev.txt``
2. ``pre-commit install``

Now every time you perform a ``git commit``, these tools will run against the
staged files:

* ``black``
* ``isort``
* ``flake8``

You can also directly invoke ``pre-commit run --all-files`` or ``tox -e linters``
to run them without performing a commit.