# Snowflake Data Validation
[License: Apache 2.0](http://www.apache.org/licenses/LICENSE-2.0.txt)
[Python](https://www.python.org/downloads/)
**Snowflake Data Validation** is a command-line tool and Python library for validating data migrations and ensuring data quality between source and target databases, with a focus on Snowflake and SQL Server.
---
##### This package is in Private Preview.
---
## 🚀 Features
- **Multi-level validation**: schema, statistical metrics, and data integrity.
- **Database connectors**: support for SQL Server and Snowflake.
- **User-friendly CLI**: commands for automation and orchestration.
- **Flexible configuration**: YAML-based validation workflows.
- **Detailed reporting**: comprehensive reports and progress tracking.
- **Extensible**: architecture ready for more database engines.
---
## 📦 Installation
```bash
pip install snowflake-data-validation
```
For SQL Server support:
```bash
pip install "snowflake-data-validation[sqlserver]"
```
For development and testing:
```bash
pip install "snowflake-data-validation[all]"
```
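After installing, you can sanity-check that the CLI entry points are on your PATH by printing the help text (a minimal check; the exact output depends on the installed version):
```bash
# Both the full command and the short alias should print the CLI help text.
snowflake-data-validation --help
sdv --help
```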
---
## ⚡ Quick Start
Run a validation from SQL Server to Snowflake:
```bash
snowflake-data-validation sqlserver run-validation --data-validation-config-file ./config/conf.yaml
```
Or use the short alias `sdv`:
```bash
sdv sqlserver run-validation --data-validation-config-file ./config/conf.yaml
```
---
## 🛠️ Configuration
Create a YAML file to define your validation workflow:
```yaml
source_platform: SqlServer
target_platform: Snowflake
output_directory_path: /path/to/output
parallelization: false

source_connection:
  mode: credentials
  host: "server"
  port: 1433
  username: "user"
  password: "password"
  database: "db"

target_connection:
  mode: name
  name: "SnowflakeConnection"

validation_configuration:
  schema_validation: true
  metrics_validation: true
  row_validation: false

comparison_configuration:
  tolerance: 0.01

tables:
  - fully_qualified_name: database.schema.table1
    use_column_selection_as_exclude_list: false
    column_selection_list:
      - column1
      - column2
```
See the documentation for more advanced configuration examples.
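For instance, the table-level `use_column_selection_as_exclude_list` flag changes how `column_selection_list` is interpreted. The sketch below shows the assumed exclude-list mode; the table and column names are placeholders, not part of the shipped examples:
```yaml
tables:
  - fully_qualified_name: database.schema.table2
    # Assumed interpretation: when true, the listed columns are skipped
    # and all remaining columns in the table are validated.
    use_column_selection_as_exclude_list: true
    column_selection_list:
      - audit_timestamp
      - etl_batch_id
```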
---
## 🏗️ Architecture
- **CLI**: `main_cli.py`, `sqlserver_cli.py`, `snowflake_cli.py`
- **Connectors**: `connector/`
- **Extractors**: `extractor/`
- **Validation**: `validation/`
- **Configuration**: `configuration/`
- **Orchestrator**: `comparison_orchestrator.py`
Project structure:
```
snowflake-data-validation/
├── src/snowflake/snowflake_data_validation/
│   ├── main_cli.py
│   ├── sqlserver/
│   ├── snowflake/
│   ├── connector/
│   ├── extractor/
│   ├── validation/
│   ├── configuration/
│   ├── utils/
│   └── comparison_orchestrator.py
├── docs/
├── tests/
└── config_files/
```
---
## 📊 Reports
- Schema validation results
- Statistical comparison metrics
- Detailed error logs and recommendations
---
## 🤝 Contributing
We welcome contributions! See our [Contributing Guide](../../CONTRIBUTING.md) for details on how to collaborate, set up your development environment, and submit PRs.
---
## 📄 License
This project is licensed under the Apache License 2.0. See the [LICENSE](../../LICENSE) file for details.
---
## 🆘 Support
- **Documentation**: [Full documentation](https://github.com/snowflakedb/migrations-data-validation)
- **Issues**: [GitHub Issues](https://github.com/snowflakedb/migrations-data-validation/issues)
---
**Developed with ❄️ by Snowflake**