datamation

Name: datamation
Version: 0.1.5
Summary: Data information synchronization
Author: lidaoran
Requires-Python: <4.0,>=3.7
License: BSD
Keywords: etl, synchronization
Upload time: 2025-07-24 02:34:19
# Datamation

![PyPI - Python Version](https://img.shields.io/pypi/pyversions/datamation)

Datamation is a Python data-processing toolkit focused on database operations, ETL data synchronization, and data comparison. It provides simple, easy-to-use methods that help developers handle data migration and synchronization tasks efficiently.

## Features

- Bulk data import, export, and ETL synchronization between different databases
- Import of relational-database data into ElasticSearch
- Table-structure diffing and table-data diffing, with differences located down to the row level
- Built-in message notifications via email, DingTalk, and WeChat Work
- Flexible data transformation and filtering

## Installation

### Basic installation

```bash
pip install datamation
```

## Basic Usage

### Database connections

```python
import pymysql
import datamation as dm

# Source connection, wrapped in a factory so each call opens a fresh connection
source_conn = lambda: pymysql.connect(
    host='127.0.0.1',
    user='root',
    passwd='xxx',
    database='demo',
    port=13307,
    write_timeout=50000,
    connect_timeout=20000
)

# Target connection
target_conn = lambda: pymysql.connect(
    host='127.0.0.1',
    user='root',
    passwd='xxx',
    database='demo',
    port=13306,
    read_timeout=50000,
    connect_timeout=10000
)
```

### Data synchronization

#### Method 1: simple sync

```python
dm.sqlsync(
    source_conn=source_conn,
    src_table='table1',
    target_conn=target_conn,
    tag_table='table2',
    # filed_adapter={dict: Json, list: Json},  # field type conversion
    batch_size=20000,  # rows per batch write
    truncate=True      # truncate the target table before the sync starts
)
```

#### Method 2: custom sync

```python
source = dm.source_sql(source_conn, 'table1')
target = dm.table_basic(target_conn, 'table1', columns=['col1', 'col2', 'col3'], batch_size=20000)
for row in source:
    target.insert(row)
target.endload()
```
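The "flexible transformation and filtering" from the feature list fits naturally into this loop: rows can be reshaped or skipped between `source` and `target.insert`. A library-free sketch of that idea follows; the `(col1, col2, col3)` row layout and the rules are illustrative assumptions, not datamation APIs.

```python
# Hypothetical per-row transform/filter, as could be applied inside the
# custom sync loop above. Row layout (col1, col2, col3) is an assumption.
def transform(row):
    """Normalize col2 to upper case."""
    col1, col2, col3 = row
    return (col1, col2.upper(), col3)

def keep(row):
    """Drop rows whose col3 is NULL."""
    return row[2] is not None

rows = [(1, "a", 10), (2, "b", None), (3, "c", 30)]
cleaned = [transform(r) for r in rows if keep(r)]
print(cleaned)  # [(1, 'A', 10), (3, 'C', 30)]
```

In the real loop, the same `transform`/`keep` calls would sit around `target.insert(row)`.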

#### Importing data into ElasticSearch

```python
from elasticsearch import Elasticsearch

es = Elasticsearch([{"host": "xxx.xxx.xxx.xxx", "port": xxxx}])

src = dm.source_sql(source_conn, 'table1')
target = dm.elastic_basic(es, index='table')
for row in src:
    target.insert(row)
target.endload()
```
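The `insert()`/`endload()` pair used by all of these writers suggests a buffered loader that flushes rows in batches and drains the remainder at the end. A minimal library-free sketch of that pattern (the class name and flush callback are assumptions, not datamation's internals):

```python
class BatchWriter:
    """Buffer rows and flush them in batches via a user-supplied callback."""

    def __init__(self, flush, batch_size=3):
        self.flush = flush          # called with a list of buffered rows
        self.batch_size = batch_size
        self.buffer = []

    def insert(self, row):
        self.buffer.append(row)
        if len(self.buffer) >= self.batch_size:
            self._flush()

    def endload(self):
        """Flush any remaining rows at the end of the load."""
        if self.buffer:
            self._flush()

    def _flush(self):
        self.flush(self.buffer)
        self.buffer = []            # start a fresh batch

batches = []
w = BatchWriter(batches.append, batch_size=3)
for row in range(7):
    w.insert(row)
w.endload()
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

Forgetting `endload()` would silently drop the final partial batch, which is why every example above ends with it.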


#### High-performance PostgreSQL import

```python
# The pgtable class replaces table_basic; it uses an internal bulk-load method
# that is faster than PEP 249 executemany.
# This is only needed with the psycopg2 driver; with psycopg (3) the generic
# table_basic class is sufficient.
# For jsonb columns, pass filed_adapter={dict: Json, list: Json} to convert
# the field types.

from datamation.db.extend import pgtable
from psycopg2.extras import Json

dm.sqlsync(
    source_conn=source_conn,
    src_table='table1',
    target_conn=target_conn,
    tag_table='table2',
    # filed_adapter={dict: Json, list: Json},  # field type conversion
    batch_size=20000,  # rows per batch write
    truncate=True,     # truncate the target table before the sync starts
    table_cls=pgtable
)

source = dm.source_sql(source_conn, 'table1')
target = pgtable(target_conn, 'table1', columns=['col1', 'col2', 'col3'], batch_size=20000)
for row in source:
    target.insert(row)
target.endload()
```

### Data comparison

#### Table structure comparison

Table-structure diffing based on SQLAlchemy:

```python
res = dm.get_sqldb_diff(source_conn, target_conn, "postgresql")
for sql in res:
    print(sql)
```
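The idea behind a structure diff can be sketched without SQLAlchemy: compare two column-to-type mappings and emit DDL for the differences. Everything here (table name, column types, statement wording) is an illustrative assumption, not datamation's output format.

```python
def diff_table(table, src_cols, tgt_cols):
    """Emit ALTER statements that would bring the target in line with the source."""
    stmts = []
    for col, typ in src_cols.items():
        if col not in tgt_cols:
            stmts.append(f"ALTER TABLE {table} ADD COLUMN {col} {typ}")
        elif tgt_cols[col] != typ:
            stmts.append(f"ALTER TABLE {table} ALTER COLUMN {col} TYPE {typ}")
    for col in tgt_cols:
        if col not in src_cols:
            stmts.append(f"ALTER TABLE {table} DROP COLUMN {col}")
    return stmts

src = {"id": "integer", "name": "varchar(50)", "created": "timestamp"}
tgt = {"id": "integer", "name": "varchar(20)", "obsolete": "text"}
for sql in diff_table("demo", src, tgt):
    print(sql)
```

For the example inputs this prints one type change, one added column, and one dropped column.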

#### Table data comparison

1. Numeric primary key:

```python
comp = dm.get_sqldata_diff2(source_conn, target_conn, 'tb_data', 'tb_data_copy1', compare_field='id')
print(comp)
```

2. UUID primary key:

```python
comp = dm.get_sqldata_diff1(source_conn, target_conn, 'tb_data', 'tb_data_copy1', compare_field='id')
print(comp)
```

3. Data volume comparison:

```python
# Primary key id is an integer; conn is a connection factory as above
import time
import datamation as dm

comp = dm.get_sqldata_diff2(conn, conn, 'yth_subject_copy1', 'yth_subject_copy2',
                            compare_field='*',
                            source_dbms_hash=dm.dbms_hash_count,
                            target_dbms_hash=dm.dbms_hash_count,
                            partition_field='id')

print('diff result:', comp.result)
print(time.ctime())
```
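The `partition_field` and `dbms_hash` parameters suggest the classic approach: hash rows per partition, compare the digests, and only drill into partitions whose digests disagree, which is how differences get localized to the row level without comparing every row. A library-free sketch of the principle (hashing scheme and partition size are assumptions):

```python
import hashlib

def partition_digests(rows, partition_size=2):
    """Map partition index -> digest of the rows in that partition."""
    digests = {}
    for i in range(0, len(rows), partition_size):
        chunk = repr(rows[i:i + partition_size]).encode()
        digests[i // partition_size] = hashlib.md5(chunk).hexdigest()
    return digests

a = [(1, "x"), (2, "y"), (3, "z"), (4, "w")]
b = [(1, "x"), (2, "y"), (3, "Z"), (4, "w")]  # one row differs
da, db = partition_digests(a), partition_digests(b)
mismatched = [p for p in da if da[p] != db.get(p)]
print(mismatched)  # [1] -- only the second partition needs a row-level check
```

In a real comparison the digests would be computed inside each database (hence the `dbms_hash` arguments), so only a few hashes cross the wire rather than the full tables.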

### Message notifications

#### DingTalk

```python
import datamation as dm

# Create a DingTalk robot instance
ding = dm.to_dingtalk(webhook="DingTalk robot webhook URL", secret="signing secret from the security settings")

# Send a text message
ding.send_text("Hello World", at_mobiles=["13800000000"], at_all=False)

# Send a Markdown message
ding.send_markdown(
    title="Title",
    text="**bold text**\nplain text\n![image](https://img.png)",
    at_mobiles=["130xxxxxxxx"],
    at_all=False
)

# Send an image message
ding.send_image("/path/to/image.png")
```
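The `secret` parameter points at DingTalk's signed-webhook scheme: the sender HMAC-signs a millisecond timestamp with the secret and appends `timestamp` and `sign` query parameters to the webhook URL. A sketch of that signature (this follows DingTalk's documented algorithm; how datamation implements it internally is an assumption):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def dingtalk_sign(secret, timestamp_ms=None):
    """Compute the timestamp/sign query parameters for a signed DingTalk webhook."""
    ts = timestamp_ms or str(round(time.time() * 1000))
    string_to_sign = f"{ts}\n{secret}".encode("utf-8")
    digest = hmac.new(secret.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
    return ts, urllib.parse.quote_plus(base64.b64encode(digest))

ts, sign = dingtalk_sign("SECxxxx", timestamp_ms="1700000000000")
print(ts, sign)
```

The resulting pair is appended to the webhook URL as `&timestamp=<ts>&sign=<sign>`.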

#### WeChat Work

```python
import datamation as dm

# Create a WeChat Work robot instance
wechat = dm.to_wechat(webhook="WeChat Work robot webhook URL")

# Send a text message
wechat.send_text("Hello World", mentioned_list=["@all"], mentioned_mobile_list=["13800138000"])

# Send a Markdown message
wechat.send_markdown("**bold text**\nplain text\n[link](https://example.com)")

# Send an image message
wechat.send_image("/path/to/image.png")
```
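Under the hood, a WeChat Work group bot accepts a JSON body POSTed to the webhook URL; the `msgtype`/`text` structure below is the format documented for that webhook, while the helper name is a hypothetical sketch (not a datamation function):

```python
import json

def wechat_text_payload(content, mentioned_list=None, mentioned_mobile_list=None):
    """Build the JSON body a WeChat Work bot webhook expects for a text message."""
    return json.dumps({
        "msgtype": "text",
        "text": {
            "content": content,
            "mentioned_list": mentioned_list or [],
            "mentioned_mobile_list": mentioned_mobile_list or [],
        },
    }, ensure_ascii=False)

payload = wechat_text_payload("Hello World", mentioned_list=["@all"])
print(payload)
```

A `send_text` call like the one above presumably builds such a body and POSTs it to the webhook.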

#### Email

Method 1: chained calls

```python
import datamation as dm

tm = dm.to_mail(user, passwd, host)
tm.name('hello world',
       to=['xxx@xx.com', 'xxx@xxx.com'],
       cc=['xxx@xx.com', 'xxx@xxx.com'],
       bcc=['xxx@xx.com', 'xxx@xxx.com'],
       showname='datamation')
tm.add_text('hello world')
tm.add_html('<p>hello world</p> <img src=cid:image001.jpg style="height:71px; width:116px" />')
tm.add_related({'image001.jpg': 'data/image001.jpg'})  # inline images referenced from the HTML body
tm.add_attachment({'data.xlsx': '/data/data.xlsx'})    # file attachments
tm.send()
```

Method 2: send everything in one call

```python
import datamation as dm

tm = dm.to_mail(user, passwd, host)
tm.send('hello world',
        to=['xxx@xx.com'],
        cc=[''],
        bcc=[''],
        showname='datamation',
        related={'image001.jpg': 'data/image001.jpg'},
        attachment={'data.xlsx': '/data/data.xlsx'})
```
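A mail helper like this presumably assembles a multipart MIME message before handing it to SMTP. The standard library can do the same assembly directly; this sketch uses `email.message.EmailMessage` with placeholder addresses and fake attachment bytes, and it only builds the message (no sending):

```python
from email.message import EmailMessage

# Hypothetical stand-in for what to_mail builds; addresses are placeholders.
msg = EmailMessage()
msg["Subject"] = "hello world"
msg["From"] = "datamation <sender@example.com>"
msg["To"] = "rcpt@example.com"
msg.set_content("hello world")                              # plain-text part
msg.add_alternative("<p>hello world</p>", subtype="html")   # HTML alternative
msg.add_attachment(b"fake bytes", maintype="application",
                   subtype="octet-stream", filename="data.xlsx")
print(msg.get_content_type())  # multipart/mixed
```

Adding the attachment promotes the text/HTML pair into a `multipart/mixed` container, which is the structure mail clients expect for a message with both a body and attachments.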

### Data backup

Wraps the pg_dump command to back up PostgreSQL databases. You can specify the database environment, schema, and tables, and optionally compress the dump into a zip file after the backup.

```python
from datamation.db.extend import pg_dump

# Database connection settings
dbcfg = {
    'dev': {
        'dbname': 'postgres',
        'user': 'postgres',
        'password': 'xxx',
        'host': '127.0.0.1',
        'port': '5432'
    }
}

# Back up specific tables
pg_dump(
    dbname=dbcfg['dev']['dbname'],
    user=dbcfg['dev']['user'],
    password=dbcfg['dev']['password'],
    host=dbcfg['dev']['host'],
    port=dbcfg['dev']['port'],
    output_dir='/path/to/backup/file.sql',
    column_inserts=True,  # emit INSERT statements instead of COPY
    compress=False,       # whether to compress the output
    include_table=['schema.table_name']  # tables to back up
)
```
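Such a wrapper presumably shells out to the `pg_dump` binary. A sketch of the argv it would build, using standard pg_dump flags (`--column-inserts`, `-t`, `-f`); the helper name is hypothetical, and the password would typically be supplied via the `PGPASSWORD` environment variable rather than a flag:

```python
# Hypothetical command builder; flags are standard pg_dump options,
# paths and table names are placeholders from the example above.
def build_pg_dump_cmd(dbname, user, host, port, output, tables,
                      column_inserts=True):
    cmd = ["pg_dump", "-h", host, "-p", str(port), "-U", user,
           "-d", dbname, "-f", output]
    if column_inserts:
        cmd.append("--column-inserts")   # INSERT statements instead of COPY
    for t in tables:
        cmd += ["-t", t]                 # restrict the dump to these tables
    return cmd

print(build_pg_dump_cmd("postgres", "postgres", "127.0.0.1", 5432,
                        "/path/to/backup/file.sql", ["schema.table_name"]))
```

The resulting list would be passed to something like `subprocess.run(cmd, env={..., "PGPASSWORD": password})`.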

## License

BSD License

## Author

lidaoran (qianxuanyon@hotmail.com)


            
