Name | Scrapy |
Version | 2.10.1 |
home_page | https://scrapy.org |
Summary | A high-level Web Crawling and Web Scraping framework |
upload_time | 2023-08-30 08:42:11 |
maintainer | Pablo Hoffman |
docs_url | None |
author | Scrapy developers |
requires_python | >=3.8 |
license | BSD |
keywords | |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | |
.. image:: https://scrapy.org/img/scrapylogo.png
:target: https://scrapy.org/
======
Scrapy
======
.. image:: https://img.shields.io/pypi/v/Scrapy.svg
:target: https://pypi.python.org/pypi/Scrapy
:alt: PyPI Version
.. image:: https://img.shields.io/pypi/pyversions/Scrapy.svg
:target: https://pypi.python.org/pypi/Scrapy
:alt: Supported Python Versions
.. image:: https://github.com/scrapy/scrapy/workflows/Ubuntu/badge.svg
:target: https://github.com/scrapy/scrapy/actions?query=workflow%3AUbuntu
:alt: Ubuntu
.. image:: https://github.com/scrapy/scrapy/workflows/macOS/badge.svg
:target: https://github.com/scrapy/scrapy/actions?query=workflow%3AmacOS
:alt: macOS
.. image:: https://github.com/scrapy/scrapy/workflows/Windows/badge.svg
:target: https://github.com/scrapy/scrapy/actions?query=workflow%3AWindows
:alt: Windows
.. image:: https://img.shields.io/badge/wheel-yes-brightgreen.svg
:target: https://pypi.python.org/pypi/Scrapy
:alt: Wheel Status
.. image:: https://img.shields.io/codecov/c/github/scrapy/scrapy/master.svg
:target: https://codecov.io/github/scrapy/scrapy?branch=master
:alt: Coverage report
.. image:: https://anaconda.org/conda-forge/scrapy/badges/version.svg
:target: https://anaconda.org/conda-forge/scrapy
:alt: Conda Version
Overview
========
Scrapy is a fast high-level web crawling and web scraping framework, used to
crawl websites and extract structured data from their pages. It can be used for
a wide range of purposes, from data mining to monitoring and automated testing.
Scrapy is maintained by Zyte_ (formerly Scrapinghub) and `many other
contributors`_.
.. _many other contributors: https://github.com/scrapy/scrapy/graphs/contributors
.. _Zyte: https://www.zyte.com/
Check the Scrapy homepage at https://scrapy.org for more information,
including a list of features.
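As a rough illustration of the structured-data extraction described above, the sketch below defines a minimal spider. It is not part of this release's metadata; the target site (the quotes.toscrape.com tutorial site) and the CSS selectors are illustrative assumptions.

.. code:: python

    import scrapy


    class QuotesSpider(scrapy.Spider):
        # Illustrative spider: the site and selectors are assumptions,
        # not something shipped with the Scrapy distribution.
        name = "quotes"
        start_urls = ["https://quotes.toscrape.com/"]

        def parse(self, response):
            # Yield one structured item per quote block on the page.
            for quote in response.css("div.quote"):
                yield {
                    "text": quote.css("span.text::text").get(),
                    "author": quote.css("small.author::text").get(),
                }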
Requirements
============
* Python 3.8+
* Works on Linux, Windows, macOS, BSD
Install
=======
The quick way:
.. code:: bash

    pip install scrapy
See the install section in the documentation at
https://docs.scrapy.org/en/latest/intro/install.html for more details.
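As a quick smoke test after installation, a spider can also be run in-process rather than through the ``scrapy`` command line. A minimal sketch, assuming the illustrative ``QuotesSpider`` class from the Overview section is in scope:

.. code:: python

    # Run a crawl in-process after ``pip install scrapy``.
    # ``QuotesSpider`` refers to the illustrative spider sketched above.
    from scrapy.crawler import CrawlerProcess

    process = CrawlerProcess(
        settings={"FEEDS": {"quotes.json": {"format": "json"}}}
    )
    process.crawl(QuotesSpider)
    process.start()  # blocks until the crawl finishes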
Documentation
=============
Documentation is available online at https://docs.scrapy.org/ and in the ``docs``
directory.
Releases
========
You can check https://docs.scrapy.org/en/latest/news.html for the release notes.
Community (blog, twitter, mail list, IRC)
=========================================
See https://scrapy.org/community/ for details.
Contributing
============
See https://docs.scrapy.org/en/master/contributing.html for details.
Code of Conduct
---------------
Please note that this project is released with a Contributor `Code of Conduct <https://github.com/scrapy/scrapy/blob/master/CODE_OF_CONDUCT.md>`_.
By participating in this project you agree to abide by its terms.
Please report unacceptable behavior to opensource@zyte.com.
Companies using Scrapy
======================
See https://scrapy.org/companies/ for a list.
Commercial Support
==================
See https://scrapy.org/support/ for details.
Raw data
{
"_id": null,
"home_page": "https://scrapy.org",
"name": "Scrapy",
"maintainer": "Pablo Hoffman",
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": "pablo@pablohoffman.com",
"keywords": "",
"author": "Scrapy developers",
"author_email": "pablo@pablohoffman.com",
"download_url": "https://files.pythonhosted.org/packages/4d/7f/29176434a645e77ad66d543cd7ebfeb94a7d19991bfaa44bb4dc13ea915c/Scrapy-2.10.1.tar.gz",
"platform": null,
"description": ".. image:: https://scrapy.org/img/scrapylogo.png\n :target: https://scrapy.org/\n \n======\nScrapy\n======\n\n.. image:: https://img.shields.io/pypi/v/Scrapy.svg\n :target: https://pypi.python.org/pypi/Scrapy\n :alt: PyPI Version\n\n.. image:: https://img.shields.io/pypi/pyversions/Scrapy.svg\n :target: https://pypi.python.org/pypi/Scrapy\n :alt: Supported Python Versions\n\n.. image:: https://github.com/scrapy/scrapy/workflows/Ubuntu/badge.svg\n :target: https://github.com/scrapy/scrapy/actions?query=workflow%3AUbuntu\n :alt: Ubuntu\n\n.. image:: https://github.com/scrapy/scrapy/workflows/macOS/badge.svg\n :target: https://github.com/scrapy/scrapy/actions?query=workflow%3AmacOS\n :alt: macOS\n\n.. image:: https://github.com/scrapy/scrapy/workflows/Windows/badge.svg\n :target: https://github.com/scrapy/scrapy/actions?query=workflow%3AWindows\n :alt: Windows\n\n.. image:: https://img.shields.io/badge/wheel-yes-brightgreen.svg\n :target: https://pypi.python.org/pypi/Scrapy\n :alt: Wheel Status\n\n.. image:: https://img.shields.io/codecov/c/github/scrapy/scrapy/master.svg\n :target: https://codecov.io/github/scrapy/scrapy?branch=master\n :alt: Coverage report\n\n.. image:: https://anaconda.org/conda-forge/scrapy/badges/version.svg\n :target: https://anaconda.org/conda-forge/scrapy\n :alt: Conda Version\n\n\nOverview\n========\n\nScrapy is a fast high-level web crawling and web scraping framework, used to\ncrawl websites and extract structured data from their pages. It can be used for\na wide range of purposes, from data mining to monitoring and automated testing.\n\nScrapy is maintained by Zyte_ (formerly Scrapinghub) and `many other\ncontributors`_.\n\n.. _many other contributors: https://github.com/scrapy/scrapy/graphs/contributors\n.. _Zyte: https://www.zyte.com/\n\nCheck the Scrapy homepage at https://scrapy.org for more information,\nincluding a list of features.\n\n\nRequirements\n============\n\n* Python 3.8+\n* Works on Linux, Windows, macOS, BSD\n\nInstall\n=======\n\nThe quick way:\n\n.. code:: bash\n\n pip install scrapy\n\nSee the install section in the documentation at\nhttps://docs.scrapy.org/en/latest/intro/install.html for more details.\n\nDocumentation\n=============\n\nDocumentation is available online at https://docs.scrapy.org/ and in the ``docs``\ndirectory.\n\nReleases\n========\n\nYou can check https://docs.scrapy.org/en/latest/news.html for the release notes.\n\nCommunity (blog, twitter, mail list, IRC)\n=========================================\n\nSee https://scrapy.org/community/ for details.\n\nContributing\n============\n\nSee https://docs.scrapy.org/en/master/contributing.html for details.\n\nCode of Conduct\n---------------\n\nPlease note that this project is released with a Contributor `Code of Conduct <https://github.com/scrapy/scrapy/blob/master/CODE_OF_CONDUCT.md>`_.\n\nBy participating in this project you agree to abide by its terms.\nPlease report unacceptable behavior to opensource@zyte.com.\n\nCompanies using Scrapy\n======================\n\nSee https://scrapy.org/companies/ for a list.\n\nCommercial Support\n==================\n\nSee https://scrapy.org/support/ for details.\n",
"bugtrack_url": null,
"license": "BSD",
"summary": "A high-level Web Crawling and Web Scraping framework",
"version": "2.10.1",
"project_urls": {
"Documentation": "https://docs.scrapy.org/",
"Homepage": "https://scrapy.org",
"Source": "https://github.com/scrapy/scrapy",
"Tracker": "https://github.com/scrapy/scrapy/issues"
},
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "3e9646dd37c023ff6b3ec96b89c046f3bc7e393a2882309037a27efc700946a2",
"md5": "5aafc74846dcca812fe8964f4f59c72c",
"sha256": "0b2f99d297f32112ae7979f50b9942a5f55b09b821f1c3afb0bed950f6a1f5a7"
},
"downloads": -1,
"filename": "Scrapy-2.10.1-py2.py3-none-any.whl",
"has_sig": false,
"md5_digest": "5aafc74846dcca812fe8964f4f59c72c",
"packagetype": "bdist_wheel",
"python_version": "py2.py3",
"requires_python": ">=3.8",
"size": 281392,
"upload_time": "2023-08-30T08:42:08",
"upload_time_iso_8601": "2023-08-30T08:42:08.935979Z",
"url": "https://files.pythonhosted.org/packages/3e/96/46dd37c023ff6b3ec96b89c046f3bc7e393a2882309037a27efc700946a2/Scrapy-2.10.1-py2.py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "4d7f29176434a645e77ad66d543cd7ebfeb94a7d19991bfaa44bb4dc13ea915c",
"md5": "eec311e7ae4f8d648fa39998de9fd285",
"sha256": "91d67875fbb537607b07e31363445718a3532b544e6e2b4baf8a042b21a1d10f"
},
"downloads": -1,
"filename": "Scrapy-2.10.1.tar.gz",
"has_sig": false,
"md5_digest": "eec311e7ae4f8d648fa39998de9fd285",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 1162241,
"upload_time": "2023-08-30T08:42:11",
"upload_time_iso_8601": "2023-08-30T08:42:11.403980Z",
"url": "https://files.pythonhosted.org/packages/4d/7f/29176434a645e77ad66d543cd7ebfeb94a7d19991bfaa44bb4dc13ea915c/Scrapy-2.10.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-08-30 08:42:11",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "scrapy",
"github_project": "scrapy",
"travis_ci": false,
"coveralls": true,
"github_actions": true,
"tox": true,
"lcname": "scrapy"
}
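The block above is the release metadata as returned by the PyPI JSON API. A minimal sketch (standard library only, assuming network access) of fetching the same document and checking each downloaded file against the recorded sha256 digests:

.. code:: python

    # Fetch the release metadata for Scrapy 2.10.1 from the PyPI JSON API
    # and verify each artifact against its recorded sha256 digest.
    import hashlib
    import json
    import urllib.request

    with urllib.request.urlopen("https://pypi.org/pypi/Scrapy/2.10.1/json") as resp:
        meta = json.load(resp)

    for release_file in meta["urls"]:
        expected = release_file["digests"]["sha256"]
        with urllib.request.urlopen(release_file["url"]) as download:
            actual = hashlib.sha256(download.read()).hexdigest()
        print(release_file["filename"], "OK" if actual == expected else "MISMATCH")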