| Field | Value |
| --- | --- |
| Name | spiders-for-all |
| Version | 0.3.9 |
| home_page | |
| Summary | |
| upload_time | 2024-02-15 07:30:28 |
| maintainer | |
| docs_url | None |
| author | icebear |
| requires_python | >=3.12,<4.0 |
| license | |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
<p align="center">
<img src="docs/logo.png" height="200px"/>
</p>
> Crawl and download data and videos from bilibili, Xiaohongshu, and other sites. Continuously updated...
> **Warning:**
>
> This project is for learning and exchange purposes only. Do not use it for commercial or illegal purposes; the author bears no responsibility for any consequences arising from such use.
https://github.com/iiicebearrr/spiders-for-all/assets/110714291/32e098d3-c85e-44c9-b54c-4820365b0235
# Menu
- [Menu](#menu)
- [Quick Preview](#quick-preview)
- [Installation](#installation)
- [Documentation](#documentation)
- [Roadmap](#roadmap)
- [Customization](#customization)
- [Find me](#find-me)
- [Known Issues](#known-issues)
# Quick Preview
**Crawl the videos posted on a bilibili user's space, given the user id**
```sh
python -m spiders_for_all bilibili download-by-author -m <user_id> -s <save_dir>
```
**Batch-crawl Xiaohongshu note content by note_id**
```sh
python -m spiders_for_all xhs download-by-id -i note_id1,note_id2,note_id3 -s <save_dir>
```
**See the [Documentation](#documentation) section for more usage**
# Installation
```sh
pip install spiders-for-all # requires Python >= 3.12
```
# Documentation
**Click through to the usage documentation for each platform**
- [bilibili](./spiders_for_all/spiders/bilibili/README.md)
- [Xiaohongshu](./spiders_for_all/spiders/xhs/README.md)
# Roadmap
- bilibili
  - [x] Spiders for sections such as "综合热门" (trending) and "入站必刷" (must-see)
  - [x] Crawl a single video or a batch of videos by bvid
  - [x] Crawl the videos posted by a user, given the user id
  - [x] Crawl a user's feed posts (动态)
- xhs
  - [x] Crawl a single note or a batch of notes by note_id
  - [x] Crawl the notes on the first page of a user's profile, given the user id
  - [x] Crawl note comments
- [ ] GUI
# Customization
**If you want to write a custom spider, the following guide may help.**
First, get familiar with the basic structure of a spider in this project:

- The `spiders_for_all/spiders` directory holds the spiders for each platform. Every spider is a class that inherits from `spiders_for_all.core.spider.BaseSpider` and implements its `get_items_from_response` method.
- A spider derived from `BaseSpider` crawls the target site through the following flow:
  - `self.__init__`: initialize the spider
    - Initialize the logger
    - Initialize the database connection
    - Initialize the database tables; the behavior depends on `cls.db_action_on_init`:
      - `spiders_for_all.core.spider.DbActionOnInit.CREATE_IF_NOT_EXISTS`: create the tables only if they do not exist
      - `spiders_for_all.core.spider.DbActionOnInit.DROP_AND_CREATE`: drop the tables and recreate them
    - Initialize the client:
      - `client` is a `spiders_for_all.core.client.HttpClient` object that wraps retry-on-failure, logging, and generation of a mock `user-agent` request header
  - `self.run`: run the spider
    - `self.before`: code executed before the spider runs; by default it logs a message
    - `self.get_items`: fetch the data from the target site
      - `self.request_items`: send a request to `self.api` and collect the response. Calls `self.get_request_args` to obtain extra request arguments such as `params`, `data`, and so on
      - `self.validate_response`: validate the structure of the response body. If `self.response_model` is defined, the body is validated against it via `pydantic`. **Note that this is business-level validation, not HTTP status validation; `self.client` checks the HTTP status automatically.** If `self.response_model` is not defined, no validation is performed and the raw `requests.Response` object is returned
      - `self.get_items_from_response`: extract the items from the validated response. The type of the returned items is determined entirely by `cls.item_model`; see the bundled spiders for concrete examples
      - **Note that `self.request_items()`, `self.validate_response`, and `self.get_items_from_response` are all wrapped in a `retry` wrapper, which means that a non-200 HTTP status code, an abnormal business response (for example bilibili's -352 business code), or a failure to extract items (for example, the page does not return normal data) will all trigger an automatic retry**
    - `self.save_items`: save the crawled data to the database
    - `self.after`: code executed after the spider runs; by default it logs a message
- In addition, a few class attributes must be defined, as in the following example (a minimal end-to-end sketch follows it):
```python
import typing as t
import logging

import requests
from sqlalchemy import orm
from pydantic import BaseModel

from spiders_for_all.core.response import Response
from spiders_for_all.core.spider import BaseSpider, DbActionOnInit, DbActionOnSave
from spiders_for_all.database.session import SessionManager
from spiders_for_all.utils.logger import default_logger


class MySpider(BaseSpider):
    api: str  # Target API, required
    name: str  # Spider name, required
    alias: str  # Spider alias, required
    platform: str  # Platform the spider belongs to, required
    description: str = ""  # Spider description, optional

    database_model: t.Type[orm.DeclarativeBase]  # Database table model, required
    item_model: t.Type[BaseModel]  # Model of the crawled items, required
    response_model: t.Type[Response] | None = None  # Response body model, optional

    logger: logging.Logger = default_logger  # Logger, optional
    session_manager: SessionManager  # Database session manager, required

    insert_batch_size: int = 100  # Batch size for bulk inserts, optional

    db_action_on_init: DbActionOnInit = DbActionOnInit.CREATE_IF_NOT_EXIST  # Behavior when initializing tables, optional
    db_action_on_save: DbActionOnSave = DbActionOnSave.DELETE_AND_CREATE  # Behavior when saving data, optional

    def get_items_from_response(self, response: requests.Response | Response) -> t.Iterable[BaseModel]:
        # Extract the items from the response body, required
        pass
```
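
To make the flow above concrete, here is a minimal sketch of a custom spider. It is not taken from the project: the API URL, the JSON layout it parses, and the `VideoItem` model are made up for illustration, and the required `database_model` / `session_manager` attributes are omitted because their setup depends on the project's ORM and database wiring. Only the class attributes and the `get_items_from_response` hook follow the description above.

```python
import typing as t

import requests
from pydantic import BaseModel

from spiders_for_all.core.spider import BaseSpider


class VideoItem(BaseModel):
    """Hypothetical item model: one record per crawled entry."""

    bvid: str
    title: str


class MyListSpider(BaseSpider):
    # Required attributes described above; the URL is a made-up placeholder.
    api = "https://example.com/api/videos"
    name = "my_list_spider"
    alias = "my-list"
    platform = "example"
    item_model = VideoItem
    # database_model and session_manager are also required but omitted here;
    # they depend on the project's ORM models and database configuration.

    def get_items_from_response(
        self, response: requests.Response
    ) -> t.Iterable[BaseModel]:
        # No response_model is set, so the raw requests.Response arrives here
        # (per the flow above). The JSON shape below is an assumption.
        payload = response.json()
        for row in payload.get("data", []):
            yield VideoItem(bvid=row["bvid"], title=row["title"])
```

Calling the spider's `run()` would then go through `before → get_items → save_items → after` as described above, with `request_items`, `validate_response`, and `get_items_from_response` retried automatically on failure.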
# Find me
- [Xiaohongshu](https://www.xiaohongshu.com/user/profile/653619a2000000000d006d1a)
- [bilibili](https://space.bilibili.com/634824012?spm_id_from=333.1007.0.0)
# Known Issues
- [x] (Resolved) When crawling the notes posted by a Xiaohongshu user, only the notes on the first page of the user's profile could be fetched, because the Xiaohongshu signature algorithm had not been cracked; data loaded by scrolling could not be crawled. The signature algorithm now borrows from [Spider-XHS](https://github.com/cv-cat/Spider_XHS.git).
- [x] Older versions of sqlite may not support the `ON CONFLICT DO UPDATE` syntax; if you run into this problem, try upgrading sqlite.
- [x] When downloading videos in large batches, a download may occasionally be interrupted and fail. In that case, once all tasks have finished, run `grep ERROR logs/bilibili.log` to see which videos failed, collect their bvids, and download them again (a small helper sketch follows this list). Handling this automatically may be added in the future.
- [x] (Resolved) Around 2024-01-20 bilibili updated the anti-crawling strategy of its user-posts API and added a `dm_img_inter` parameter. The author has not yet had time to implement an algorithm that generates this parameter, so the endpoint is temporarily unusable. If you implement the generation algorithm yourself, you can keep using the endpoint by modifying the `get_request_args` method of `spiders_for_all.spiders.bilibili.AuthorSpider`.
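
As a rough helper for the interrupted-download issue above, a shell one-liner along these lines can collect the failed bvids once all tasks have finished. The log line format and the `BV` + 10 alphanumeric characters pattern are assumptions; adjust them to whatever `logs/bilibili.log` actually contains.

```sh
# Pull error lines from the log, extract anything that looks like a bvid, and de-duplicate.
grep ERROR logs/bilibili.log | grep -oE 'BV[0-9A-Za-z]{10}' | sort -u > failed_bvids.txt
```

The collected bvids can then be fed back into the batch download-by-bvid command mentioned in the Roadmap.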
Raw data
{
    "_id": null,
    "home_page": "",
    "name": "spiders-for-all",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.12,<4.0",
    "maintainer_email": "",
    "keywords": "",
    "author": "icebear",
    "author_email": "iiiicebeaaaar@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/9c/3e/4bd249769c2afef8a9640744ab1acaf48ed43ac90df651cae168d218496c/spiders_for_all-0.3.9.tar.gz",
    "platform": null,
"description": "<p align=\"center\">\n<img src=\"docs/logo.png\" height=\"200px\"/>\n</p>\n\n> \u722c\u53d6\u3001\u4e0b\u8f7d\u54d4\u54e9\u54d4\u54e9\u3001\u5c0f\u7ea2\u4e66\u7b49\u7f51\u7ad9\u6570\u636e\u3001\u89c6\u9891, \u6301\u7eed\u66f4\u65b0\u4e2d...\n\n> **Warning:**\n> \n> \u672c\u9879\u76ee\u4ec5\u4f9b\u5b66\u4e60\u4ea4\u6d41\u4f7f\u7528, \u8bf7\u52ff\u7528\u4e8e\u5546\u4e1a\u53ca\u975e\u6cd5\u7528\u9014, \u7531\u6b64\u5f15\u8d77\u7684\u4e00\u5207\u540e\u679c\u4e0e\u4f5c\u8005\u65e0\u5173\n\n\n\nhttps://github.com/iiicebearrr/spiders-for-all/assets/110714291/32e098d3-c85e-44c9-b54c-4820365b0235\n\n\n\n\n# Menu\n\n- [Menu](#menu)\n- [Quick Preview](#quick-preview)\n- [Installation](#installation)\n- [Documentation](#documentation)\n- [Roadmap](#roadmap)\n- [Customization](#customization)\n- [Find me](#find-me)\n- [Known Issues](#known-issues)\n\n# Quick Preview \n\n**\u6839\u636e\u7528\u6237id\u722c\u53d6b\u7ad9\u7528\u6237\u4e3b\u9875\u6295\u7a3f\u89c6\u9891**\n\n```sh\npython -m spiders_for_all bilibili download-by-author -m \u7528\u6237id -s \u4fdd\u5b58\u76ee\u5f55\n```\n\n**\u6839\u636enote_id\u6279\u91cf\u722c\u53d6\u5c0f\u7ea2\u4e66\u7b14\u8bb0\u5185\u5bb9**\n\n```sh\npython -m spiders_for_all xhs download-by-id -i note_id1,note_id2,note_id3 -s \u4fdd\u5b58\u76ee\u5f55\n```\n\n**\u66f4\u591a\u7528\u6cd5\u89c1[Documentation](#documentation)\u90e8\u4efd**\n\n# Installation\n\n```sh\npip install spiders-for-all # python \u7248\u672c >= 3.12\n```\n\n# Documentation\n\n**\u70b9\u51fb\u8fdb\u5165\u5bf9\u5e94\u5e73\u53f0\u7684\u4f7f\u7528\u6587\u6863**\n\n- [\u54d4\u54e9\u54d4\u54e9](./spiders_for_all/spiders/bilibili/README.md)\n\n- [\u5c0f\u7ea2\u4e66](./spiders_for_all/spiders/xhs/README.md)\n\n# Roadmap\n\n- bilibili\n - [x] \u7efc\u5408\u70ed\u95e8\u3001\u5165\u7ad9\u5fc5\u5237\u7b49\u680f\u76ee\u722c\u866b\n - [x] \u6839\u636ebvid\u722c\u53d6/\u6279\u91cf\u722c\u53d6\u89c6\u9891\n - [x] \u6839\u636e\u7528\u6237id\u722c\u53d6\u7528\u6237\u4e3b\u9875\u6295\u7a3f\u89c6\u9891\n - [x] \u722c\u53d6\u7528\u6237\u52a8\u6001\n- xhs\n - [x] \u6839\u636enote_id\u722c\u53d6/\u6279\u91cf\u722c\u53d6\u7b14\u8bb0\n - [x] \u6839\u636e\u7528\u6237id\u722c\u53d6\u7528\u6237\u4e3b\u9875\u9996\u9875\u7b14\u8bb0\n - [x] \u722c\u53d6\u7b14\u8bb0\u8bc4\u8bba\n- [ ] GUI\n\n# Customization\n\n**\u5982\u679c\u4f60\u60f3\u81ea\u5b9a\u4e49\u722c\u866b, \u53ef\u4ee5\u53c2\u8003\u4ee5\u4e0b\u6307\u5f15**\n\n\u9996\u5148\u9700\u8981\u4e86\u89e3\u4e00\u4e0b\u9879\u76ee\u722c\u866b\u7684\u57fa\u672c\u7ed3\u6784, \u53c2\u8003\u4e0b\u56fe:\n\n\n\n- `spiders_for_all/spiders`\u76ee\u5f55\u4e0b\u5b58\u653e\u5404\u4e2a\u5e73\u53f0\u7684\u722c\u866b, \u6bcf\u4e2a\u722c\u866b\u90fd\u662f\u4e00\u4e2a\u7c7b, \u7ee7\u627f\u81ea`spiders_for_all.core.spider.BaseSpider`\u7c7b, \u5e76\u5b9e\u73b0`BaseSpider`\u7c7b\u7684`get_items_from_response`\u65b9\u6cd5\n- \u7ee7\u627f\u81ea`BaseSpider`\u7684\u722c\u866b, \u4f1a\u6309\u7167\u4ee5\u4e0b\u6d41\u7a0b\u722c\u53d6\u76ee\u6807\u7f51\u7ad9\u6570\u636e\n - `self.__init__`: \u521d\u59cb\u5316\u722c\u866b\n - \u521d\u59cb\u5316logger\n - \u521d\u59cb\u5316\u6570\u636e\u5e93\u8fde\u63a5\n - \u521d\u59cb\u5316\u6570\u636e\u5e93\u8868, \u8fd9\u4e00\u6b65\u7684\u884c\u4e3a\u53d6\u51b3\u4e8e`cls.db_action_on_init`:\n - `spiders_for_all.core.spider.DbActionOnInit.CREATE_IF_NOT_EXISTS`: \u5982\u679c\u8868\u4e0d\u5b58\u5728\u5219\u521b\u5efa\n - `spiders_for_all.core.spider.DbActionOnInit.DROP_AND_CREATE`: \u76f4\u63a5\u5220\u9664\u5e76\u521b\u5efa\u8868\n - \u521d\u59cb\u5316client:\n - 
`client`\u662f\u4e00\u4e2a`spiders_for_all.core.client.HttpClient`\u5bf9\u8c61, \u5c01\u88c5\u4e86\u5931\u8d25\u91cd\u8bd5\u3001\u65e5\u5fd7\u3001\u4ee5\u53ca\u751f\u6210\u6a21\u62df\u8bf7\u6c42\u5934`user-agent`\u7684\u529f\u80fd\n - `self.run`: \u8fd0\u884c\u722c\u866b\n - `self.before`: \u8fd0\u884c\u722c\u866b\u524d\u6267\u884c\u7684\u4ee3\u7801, \u9ed8\u8ba4\u4f1a\u6253\u5370\u4e00\u6761\u65e5\u5fd7\n - `self.get_items`: \u4ece\u76ee\u6807\u7f51\u7ad9\u83b7\u53d6\u6570\u636e\n - `self.request_items`: \u5411`self.api`\u53d1\u9001\u8bf7\u6c42, \u83b7\u53d6\u8fd4\u56de\u3002\u4f1a\u8c03\u7528`self.get_request_args`\u83b7\u53d6\u989d\u5916\u7684\u8bf7\u6c42\u53c2\u6570\u6bd4\u5982`params`, `data`\u7b49\u7b49\n - `self.validate_response`: \u9a8c\u8bc1\u8fd4\u56de\u4f53\u7ed3\u6784, \u5982\u679c\u5b9a\u4e49\u4e86`self.response_model`, \u4f1a\u8c03\u7528`self.response_model`\u57fa\u4e8e`pydantic`\u5bf9\u8fd4\u56de\u4f53\u8fdb\u884c\u9a8c\u8bc1\u3002**\u6ce8\u610f\u8fd9\u91cc\u7684\u9a8c\u8bc1\u662f\u4e1a\u52a1\u9a8c\u8bc1, \u4e0d\u662fHTTP\u72b6\u6001\u7684\u9a8c\u8bc1, `self.client`\u4f1a\u81ea\u52a8\u9a8c\u8bc1HTTP\u72b6\u6001\u3002** \u5982\u679c\u6ca1\u6709\u5b9a\u4e49`self.response_model`, \u5219\u4e0d\u4f1a\u8fdb\u884c\u9a8c\u8bc1\uff0c\u76f4\u63a5\u8fd4\u56de\u539f\u59cb\u7684`requests.Response`\u5bf9\u8c61\n - `self.get_items_from_response`: \u4ece\u9a8c\u8bc1\u6210\u529f\u7684\u8fd4\u56de\u4f53\u4e2d\u53d6\u51fa\u5bf9\u5e94\u7684items, \u8fd4\u56deitems\u7684\u6570\u636e\u7c7b\u578b\u5b8c\u5168\u53d6\u51b3\u4e8e`cls.item_model`, \u5177\u4f53\u53ef\u4ee5\u53c2\u8003\u51e0\u4e2a\u722c\u866b\u7684\u5b9e\u73b0\n - **\u9700\u8981\u6ce8\u610f\u7684\u662f, `self.request_items()`, `self.validate_response`, `self.get_items_from_response`\u5747\u88ab\u5c01\u88c5\u5728\u4e00\u4e2a`retry`\u7684wrapper\u5185\uff0c\u8fd9\u4e5f\u5c31\u610f\u5473\u7740\u5982\u679c\u51fa\u73b0: `HTTP\u72b6\u6001\u7801\u975e200`, `\u4e1a\u52a1\u8fd4\u56de\u4f53\u5f02\u5e38, \u6bd4\u5982\u54d4\u54e9\u54d4\u54e9\u7684-352\u4e1a\u52a1\u7801`, `\u83b7\u53d6items\u5931\u8d25\uff0c\u6bd4\u5982\u7f51\u9875\u6ca1\u6709\u8fd4\u56de\u6b63\u5e38\u7684\u6570\u636e`\u8fd9\u4e9b\u60c5\u51b5, \u5747\u4f1a\u89e6\u53d1\u81ea\u52a8\u91cd\u8bd5**\n - `self.save_items`: \u5c06\u722c\u53d6\u7684\u6570\u636e\u4fdd\u5b58\u5230\u6570\u636e\u5e93\n - `self.after`: \u8fd0\u884c\u722c\u866b\u540e\u6267\u884c\u7684\u4ee3\u7801, \u9ed8\u8ba4\u4f1a\u6253\u5370\u4e00\u6761\u65e5\u5fd7\n\n- \u9664\u6b64\u4e4b\u5916, \u9700\u8981\u989d\u5916\u5b9a\u4e49\u4e00\u4e9b\u7c7b\u5c5e\u6027, \u793a\u4f8b\u5982\u4e0b\n \n ```python\n import typing as t\n import logging\n import requests\n from sqlalchemy import orm\n from pydantic import BaseModel\n from spiders_for_all.core.response import Response\n from spiders_for_all.core.spider import BaseSpider, DbActionOnInit, DbActionOnSave\n from spiders_for_all.database.session import SessionManager\n from spiders_for_all.utils.logger import default_logger\n\n class MySpider(BaseSpider):\n api: str # \u76ee\u6807API, \u5fc5\u987b\n name: str # \u722c\u866b\u540d\u79f0, \u5fc5\u987b\n alias: str # \u722c\u866b\u522b\u540d, \u5fc5\u987b\n platform: str # \u722c\u866b\u6240\u5c5e\u5e73\u53f0, \u5fc5\u987b\n description: str = \"\" # \u722c\u866b\u63cf\u8ff0, \u53ef\u9009\n\n database_model: t.Type[orm.DeclarativeBase] # \u6570\u636e\u5e93\u8868\u6a21\u578b, \u5fc5\u987b\n item_model: t.Type[BaseModel] # \u722c\u53d6\u7684\u6570\u636e\u6a21\u578b, \u5fc5\u987b\n response_model: t.Type[Response] | None = None # 
\u8fd4\u56de\u4f53\u6a21\u578b, \u53ef\u9009\n\n logger: logging.Logger = default_logger # \u65e5\u5fd7\u5bf9\u8c61, \u53ef\u9009\n session_manager: SessionManager # \u6570\u636e\u5e93\u8fde\u63a5\u7ba1\u7406\u5668, \u5fc5\u987b\n\n insert_batch_size: int = 100 # \u6279\u91cf\u63d2\u5165\u6570\u636e\u5e93\u65f6\u7684\u6279\u91cf\u5927\u5c0f, \u53ef\u9009\n\n db_action_on_init: DbActionOnInit = DbActionOnInit.CREATE_IF_NOT_EXIST # \u521d\u59cb\u5316\u6570\u636e\u5e93\u8868\u65f6\u7684\u884c\u4e3a, \u53ef\u9009\n db_action_on_save: DbActionOnSave = DbActionOnSave.DELETE_AND_CREATE # \u4fdd\u5b58\u6570\u636e\u65f6\u7684\u884c\u4e3a, \u53ef\u9009\n\n def get_items_from_response(self, response: requests.Response | Response) -> t.Iterable[BaseModel]:\n # \u4ece\u8fd4\u56de\u4f53\u4e2d\u53d6\u51faitems, \u5fc5\u987b\n pass\n ```\n\n# Find me\n\n- [\u5c0f\u7ea2\u4e66](https://www.xiaohongshu.com/user/profile/653619a2000000000d006d1a)\n- [bilibili](https://space.bilibili.com/634824012?spm_id_from=333.1007.0.0)\n \n\n# Known Issues\n\n- [x] (\u5df2\u89e3\u51b3)\u5c0f\u7ea2\u4e66\u722c\u53d6\u7528\u6237\u6295\u7a3f\u7684\u7b14\u8bb0\u65f6, \u7531\u4e8e\u5c0f\u7ea2\u4e66\u7b7e\u540d\u7b97\u6cd5\u7684\u95ee\u9898\u5c1a\u672a\u89e3\u51b3, \u53ea\u80fd\u722c\u53d6\u7528\u6237\u6295\u7a3f\u7684\u9996\u9875\u6570\u636e, \u9700\u8981\u4e0b\u62c9\u52a0\u8f7d\u7684\u6570\u636e\u6682\u65f6\u65e0\u6cd5\u722c\u53d6(\u7b7e\u540d\u7b97\u6cd5\u501f\u9274\u4e86[Spider-XHS](https://github.com/cv-cat/Spider_XHS.git))\n- [x] \u4f4e\u7248\u672c\u7684sqlite\u53ef\u80fd\u4e0d\u652f\u6301`ON CONFLICT DO UPDATE`\u8bed\u6cd5, \u5982\u679c\u9047\u5230\u8be5\u95ee\u9898\u8bf7\u5c1d\u8bd5\u5347\u7ea7sqlite\u7248\u672c \n- [x] \u5927\u6279\u91cf\u4e0b\u8f7d\u89c6\u9891\u65f6\u53ef\u80fd\u5076\u73b0\u4e0b\u8f7d\u4e2d\u65ad\u5bfc\u81f4\u5931\u8d25\u7684\u60c5\u51b5, \u8fd9\u79cd\u60c5\u51b5\u8bf7\u5728\u6240\u6709\u4efb\u52a1\u4e0b\u8f7d\u5b8c\u6210\u540e, \u8fd0\u884c`grep ERROR logs/bilibili.log`\u53ef\u4ee5\u67e5\u770b\u6709\u54ea\u4e9b\u89c6\u9891\u4e0b\u8f7d\u5931\u8d25, \u7136\u540e\u6536\u96c6bvid\u91cd\u65b0\u4e0b\u8f7d\u5373\u53ef\u3002\u672a\u6765\u4f1a\u8003\u8651\u5904\u7406\u8fd9\u79cd\u60c5\u51b5\n- [x] (\u5df2\u89e3\u51b3)bilibili\u8fd1\u671f(2024-01-20\u5de6\u53f3)\u66f4\u65b0\u4e86\u7528\u6237\u6295\u7a3f\u63a5\u53e3\u7684\u53cd\u722c\u7b56\u7565, \u65b0\u589e\u4e86`dm_img_inter`\u53c2\u6570, \u76ee\u524d\u4f5c\u8005\u8fd8\u6ca1\u6709\u65f6\u95f4\u5b9e\u73b0\u8be5\u53c2\u6570\u7684\u751f\u6210\u7b97\u6cd5, \u6240\u4ee5\u8be5\u63a5\u53e3\u6682\u65f6\u65e0\u6cd5\u4f7f\u7528, \u4f46\u662f\u5982\u679c\u4f60\u5b9e\u73b0\u4e86\u8be5\u53c2\u6570\u7684\u751f\u6210\u7b97\u6cd5, \u53ef\u4ee5\u901a\u8fc7\u4fee\u6539`spiders_for_all.spiders.bilibili.AuthorSpider`\u4e2d\u7684`get_request_args`\u65b9\u6cd5\u6765\u7ee7\u7eed\u4f7f\u7528\u8be5\u63a5\u53e3\n",
"bugtrack_url": null,
"license": "",
"summary": "",
"version": "0.3.9",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "60697c4673dabe0f5e58b0aa48d3ed579a40a73035ea8806ff7a9ce9e4aeb0ad",
"md5": "1c66584e3f881583d5d98a1fac082e5c",
"sha256": "8bf59379f089efd9c7135b5ceae127f0a51a307d594cdfb20946c90a8e45c557"
},
"downloads": -1,
"filename": "spiders_for_all-0.3.9-py3-none-any.whl",
"has_sig": false,
"md5_digest": "1c66584e3f881583d5d98a1fac082e5c",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.12,<4.0",
"size": 94060,
"upload_time": "2024-02-15T07:30:27",
"upload_time_iso_8601": "2024-02-15T07:30:27.288430Z",
"url": "https://files.pythonhosted.org/packages/60/69/7c4673dabe0f5e58b0aa48d3ed579a40a73035ea8806ff7a9ce9e4aeb0ad/spiders_for_all-0.3.9-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "9c3e4bd249769c2afef8a9640744ab1acaf48ed43ac90df651cae168d218496c",
"md5": "0ebe137aa30a58484fc94e5351b23af9",
"sha256": "83c11ccad23f462c648f1810e07bbc17bbbe76c0cee72201f90df0e66c02872c"
},
"downloads": -1,
"filename": "spiders_for_all-0.3.9.tar.gz",
"has_sig": false,
"md5_digest": "0ebe137aa30a58484fc94e5351b23af9",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.12,<4.0",
"size": 83765,
"upload_time": "2024-02-15T07:30:28",
"upload_time_iso_8601": "2024-02-15T07:30:28.795360Z",
"url": "https://files.pythonhosted.org/packages/9c/3e/4bd249769c2afef8a9640744ab1acaf48ed43ac90df651cae168d218496c/spiders_for_all-0.3.9.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-02-15 07:30:28",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "spiders-for-all"
}