scrapyd-api

Name: scrapyd-api
Version: 0.0.3
Summary: scrapyd api
Home page: https://github.com/mouday/scrapyd-api
Author: Peng Shiyu
License: MIT
Keywords: spider admin
Upload time: 2024-06-07 09:23:40
# scrapyd API

Github: [https://github.com/mouday/scrapyd-api](https://github.com/mouday/scrapyd-api)

Gitee: [https://gitee.com/mouday/scrapyd-api](https://gitee.com/mouday/scrapyd-api)

Pypi: [https://pypi.org/project/scrapyd-api](https://pypi.org/project/scrapyd-api)


## Installation
```bash
pip install scrapyd-api
```

## Usage example
```python
# -*- coding: utf-8 -*-
from pprint import pprint

from scrapyd_api import ScrapydClient

client = ScrapydClient()
pprint(client.daemon_status())
"""
{'finished': 67,
 'node_name': 'localhost',
 'pending': 0,
 'running': 0,
 'status': 'ok',
 'total': 67}
"""
``` 


## Introduction

Note: this package was written against `scrapyd 1.2.1`; if your Scrapyd version differs significantly, errors may occur.

ScrapydAPI returns the original Scrapyd API responses unchanged, which makes it convenient for building on top of.

API documentation: [https://scrapyd.readthedocs.io/en/stable/api.html](https://scrapyd.readthedocs.io/en/stable/api.html)

```python
class ScrapydAPI:
    add_version
    cancel
    delete_project
    delete_version
    list_jobs
    list_projects
    list_spiders
    list_versions
    schedule
    daemon_status
```
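For reference, each of the methods above corresponds to one of Scrapyd's JSON endpoints (documented at the link above). A minimal sketch of that mapping using only the standard library, assuming the default Scrapyd server address `http://localhost:6800/` (the `endpoint_url` helper is illustrative, not part of this package):

```python
from urllib.parse import urljoin

# Scrapyd's JSON endpoints, per the official API docs; the keys mirror
# the ScrapydAPI method names listed above.
ENDPOINTS = {
    "daemon_status":  "daemonstatus.json",
    "add_version":    "addversion.json",
    "schedule":       "schedule.json",
    "cancel":         "cancel.json",
    "list_projects":  "listprojects.json",
    "list_versions":  "listversions.json",
    "list_spiders":   "listspiders.json",
    "list_jobs":      "listjobs.json",
    "delete_version": "delversion.json",
    "delete_project": "delproject.json",
}

def endpoint_url(method: str, base_url: str = "http://localhost:6800/") -> str:
    """Build the full URL for a given API method name."""
    return urljoin(base_url, ENDPOINTS[method])

print(endpoint_url("daemon_status"))  # http://localhost:6800/daemonstatus.json
```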

The `ScrapydClient` class inherits from `ScrapydAPI` and extends and enhances it:

```python
class ScrapydClient(ScrapydAPI):
    # Enhanced data methods
    daemon_status             # adds a `total` field to the response
    add_version               # `version` defaults to the current 10-digit timestamp
    list_spiders              # response changed from list of strings to list of dicts
    list_projects             # response changed from list of strings to list of dicts
    list_versions             # response changed from list of strings to list of dicts

    # Extended data methods
    job_status                # query the status of a job
    list_versions_format      # format version numbers as datetimes ('%Y-%m-%d %H:%M:%S')
    list_jobs_merge           # merged job list
    cancel_all_project_job    # cancel jobs across all projects
    cancel_all_job            # cancel all jobs under a given project

    # Extended log methods
    logs                      # get logs: project list
    project_logs              # get logs: spider list
    spider_logs               # get logs: job list
    job_log                   # get the log of a single job
```
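The interplay between `add_version`'s default version and `list_versions_format` can be illustrated in plain Python. This is a sketch of the idea, not the package's internal code; `format_version` is a hypothetical helper:

```python
import time
from datetime import datetime

# add_version's default: the current Unix timestamp, which is 10 digits
# for present-day dates.
version = str(int(time.time()))
assert len(version) == 10

def format_version(version: str) -> str:
    """Render a timestamp version the way list_versions_format describes:
    as a '%Y-%m-%d %H:%M:%S' datetime string."""
    return datetime.fromtimestamp(int(version)).strftime("%Y-%m-%d %H:%M:%S")

print(format_version(version))  # output depends on the local timezone
```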

            
