# Source
> https://github.com/shengchenyang/AyugeSpiderTools/blob/master/docs//docs/intro/install.md
>
> Adds templates for personal use
## Installation
> With `python 3.8+`, install directly with the following command:
```shell
pip install gzspidertools
```
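After the base install, the `ayuge` command-line tool should be on your PATH; a quick way to confirm (using the `ayuge version` command shown in the Usage section below):

```shell
# Confirm the package is installed and the ayuge CLI is available
pip show gzspidertools
ayuge version
```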
> Optional install 1: install all database-related dependencies:
```shell
pip install gzspidertools[database]
```
> Optional install 2: install all dependencies with the following command:
```shell
pip install gzspidertools[all]
```
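A shell caveat worth noting (general `pip` behavior, not specific to this library): in `zsh`, square brackets are glob characters, so quote the extras spec to avoid a "no matches found" error:

```shell
# Quote the extras so zsh does not treat [] as a glob pattern
pip install "gzspidertools[database]"
pip install "gzspidertools[all]"
```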
*Note: see the [installation guide](https://ayugespidertools.readthedocs.io/en/latest/intro/install.html) for detailed installation instructions.*
## Usage
```shell
# Check the library version
ayuge version

# Create a project
ayuge startproject <project_name>

# Enter the project root directory
cd <project_name>

# Replace (overwrite) it with your real .conf configuration file.
# This is only for demonstration; normally you just fill in the settings you
# need in the .conf file under VIT.
cp /root/mytemp/.conf DemoSpider/VIT/.conf

# Generate a spider script
ayuge genspider <spider_name> <example.com>

# Run the spider
scrapy crawl <spider_name>
# Note: ayuge crawl <spider_name> also works
```
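The steps above assume that `VIT/.conf` has been filled in. As a rough sketch only, assuming a MySQL-style section layout (the section and key names below are illustrative guesses, not taken from this README; consult the linked installation guide for the real schema), the file might be populated like this:

```shell
# Hypothetical sketch: the [mysql] section and key names are assumptions,
# not documented in this README -- check the installation guide for the
# actual .conf schema.
cat > DemoSpider/VIT/.conf << 'EOF'
[mysql]
host=127.0.0.1
port=3306
user=root
password=your_password
database=demo_db
EOF
```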
## Raw data

```json
{
  "_id": null,
  "home_page": "https://github.com/georgeJzzz/gzspidertools",
  "name": "gzSpiderTools",
  "maintainer": "George",
  "docs_url": null,
  "requires_python": ">=3.8.1,<4.0.0",
  "maintainer_email": "persistencehoo@gmail.com",
  "keywords": "crawler,scraping,aiohttp,asyncio,scrapy",
  "author": "George",
  "author_email": "persistencehoo@gmail.com",
  "download_url": "https://files.pythonhosted.org/packages/f9/87/18576f6208f2bbcfbb40394a6aa2b37b24e9e4a4699fc65e28b603612bf9/gzspidertools-0.0.2.tar.gz",
  "platform": null,
"description": "# \u6765\u81ea\n> https://github.com/shengchenyang/AyugeSpiderTools/blob/master/docs//docs/intro/install.md\n> \n> \u589e\u52a0\u4e2a\u4eba\u4f7f\u7528\u7684\u6a21\u677f\n\n## \u5b89\u88c5\n\n> `python 3.8+` \u53ef\u4ee5\u76f4\u63a5\u8f93\u5165\u4ee5\u4e0b\u547d\u4ee4\uff1a\n\n```shell\npip install gzspidertools\n```\n\n> \u53ef\u9009\u5b89\u88c51\uff0c\u5b89\u88c5\u6570\u636e\u5e93\u76f8\u5173\u7684\u6240\u6709\u4f9d\u8d56\uff1a\n\n```shell\npip install gzspidertools[database]\n```\n\n> \u53ef\u9009\u5b89\u88c52\uff0c\u901a\u8fc7\u4ee5\u4e0b\u547d\u4ee4\u5b89\u88c5\u6240\u6709\u4f9d\u8d56\uff1a\n\n```shell\npip install gzspidertools[all]\n```\n\n*\u6ce8\uff1a\u8be6\u7ec6\u7684\u5b89\u88c5\u4ecb\u7ecd\u8bf7\u67e5\u770b[\u5b89\u88c5\u6307\u5357](https://ayugespidertools.readthedocs.io/en/latest/intro/install.html)\u3002*\n\n## \u7528\u6cd5\n```shell\n# \u67e5\u770b\u5e93\u7248\u672c\nayuge version\n\n# \u521b\u5efa\u9879\u76ee\nayuge startproject <project_name>\n\n# \u8fdb\u5165\u9879\u76ee\u6839\u76ee\u5f55\ncd <project_name>\n\n# \u66ff\u6362(\u8986\u76d6)\u4e3a\u771f\u5b9e\u7684\u914d\u7f6e .conf \u6587\u4ef6\uff1a\n# \u8fd9\u91cc\u662f\u4e3a\u4e86\u6f14\u793a\u65b9\u4fbf\uff0c\u6b63\u5e38\u60c5\u51b5\u662f\u76f4\u63a5\u5728 VIT \u4e2d\u7684 .conf \u6587\u4ef6\u586b\u4e0a\u4f60\u9700\u8981\u7684\u914d\u7f6e\u5373\u53ef\ncp /root/mytemp/.conf DemoSpider/VIT/.conf\n\n# \u751f\u6210\u722c\u866b\u811a\u672c\nayuge genspider <spider_name> <example.com>\n\n# \u8fd0\u884c\u811a\u672c\nscrapy crawl <spider_name>\n# \u6ce8\uff1a\u4e5f\u53ef\u4ee5\u4f7f\u7528 ayuge crawl <spider_name>\n```\n\n",
"bugtrack_url": null,
"license": "",
"summary": "scrapy \u6269\u5c55\u5e93\uff1a\u7528\u4e8e\u6269\u5c55 Scrapy \u529f\u80fd\u6765\u89e3\u653e\u53cc\u624b\u3002",
"version": "0.0.2",
"project_urls": {
"Homepage": "https://github.com/georgeJzzz/gzspidertools",
"Repository": "https://github.com/georgeJzzz/gzspidertools"
},
"split_keywords": [
"crawler",
"scraping",
"aiohttp",
"asyncio",
"scrapy"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "900792a3b0c087ba2a9c921f9df08d01f30006caefee14eca1fb386b74a524a6",
"md5": "903155f85d5ad708b6e5575502ca7dfd",
"sha256": "9feef63adc009c3428a2efb5e0b8287469e284a29ef6f5b79463099eadfb1331"
},
"downloads": -1,
"filename": "gzspidertools-0.0.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "903155f85d5ad708b6e5575502ca7dfd",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8.1,<4.0.0",
"size": 104395,
"upload_time": "2024-02-21T15:52:59",
"upload_time_iso_8601": "2024-02-21T15:52:59.773721Z",
"url": "https://files.pythonhosted.org/packages/90/07/92a3b0c087ba2a9c921f9df08d01f30006caefee14eca1fb386b74a524a6/gzspidertools-0.0.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "f98718576f6208f2bbcfbb40394a6aa2b37b24e9e4a4699fc65e28b603612bf9",
"md5": "49e8fad0d45744b9cb77c408f4c24028",
"sha256": "86c95ef0c6ba8fb7952fbed41b1a3b09a6b250a05a994943780f0bb7da58a5ef"
},
"downloads": -1,
"filename": "gzspidertools-0.0.2.tar.gz",
"has_sig": false,
"md5_digest": "49e8fad0d45744b9cb77c408f4c24028",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8.1,<4.0.0",
"size": 71218,
"upload_time": "2024-02-21T15:53:02",
"upload_time_iso_8601": "2024-02-21T15:53:02.526679Z",
"url": "https://files.pythonhosted.org/packages/f9/87/18576f6208f2bbcfbb40394a6aa2b37b24e9e4a4699fc65e28b603612bf9/gzspidertools-0.0.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-02-21 15:53:02",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "georgeJzzz",
"github_project": "gzspidertools",
"github_not_found": true,
"lcname": "gzspidertools"
}