## Spare Cores Crawler
SC Crawler is a Python package that pulls and standardizes data on cloud
compute resources, with tooling to organize the collected data and keep
it up to date in databases.
## Installation
Stable version from PyPI:
```
pip install sparecores-crawler
```
Most recent version from GitHub:
```
pip install "sparecores-crawler @ git+https://git@github.com/SpareCores/sc-crawler.git"
```
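To confirm which release was installed (0.3.1 is the latest on PyPI at the time of writing), a quick check using only the Python standard library:

```python
# Print the installed version of the package, e.g. "0.3.1".
from importlib.metadata import version

print(version("sparecores-crawler"))
```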
## References
- [Package documentation](https://sparecores.github.io/sc-crawler/)
- [Database schemas](https://dbdocs.io/spare-cores/sc-crawler)
- [Latest SQLite database release](https://sc-data-public-40e9d310.s3.amazonaws.com/sc-data-all.db.bz2) (see the query sketch below)
- [sparecores.com](https://sparecores.com)
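For a quick look at the published data, the sketch below downloads the SQLite release linked above, decompresses it, and lists the tables it contains, using only the Python standard library. The table layout itself is documented in the database schemas linked above; the file is a full database dump, so the download may take a while.

```python
# Sketch: fetch the latest public SQLite release, decompress it,
# and list the tables it contains (standard library only).
import bz2
import sqlite3
import urllib.request

URL = "https://sc-data-public-40e9d310.s3.amazonaws.com/sc-data-all.db.bz2"

with urllib.request.urlopen(URL) as response:
    compressed = response.read()

with open("sc-data-all.db", "wb") as f:
    f.write(bz2.decompress(compressed))

con = sqlite3.connect("sc-data-all.db")
tables = [
    row[0]
    for row in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
    )
]
print(tables)  # table names follow the schema documented at dbdocs.io
con.close()
```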
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "sparecores-crawler",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": "Spare Cores team <pkg@sparecores.com>",
"keywords": "cloud, compute, etl, sqlite, spot-instances, cost-optimization",
"author": "Attila Nagy, Gergely Daroczi, Balazs Hodobay",
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/e2/d4/e7a1c641933d6942a90670aa82bc82dfe4ddd3c2a7a2760c1e59739d1026/sparecores_crawler-0.3.1.tar.gz",
"platform": null,
"description": "## Spare Cores Crawler\n\n[](https://github.com/SpareCores/sc-crawler/actions/workflows/tests.yaml)\n<picture><source media=\"(prefers-color-scheme: dark)\" srcset=\"https://img.shields.io/badge/status-alpha-blue\"><source media=\"(prefers-color-scheme: light)\" srcset=\"https://img.shields.io/badge/status-alpha-blue\"><img alt=\"Project Status: Alpha\" src=\"https://img.shields.io/badge/status-alpha-blue\"></picture>\n<picture><source media=\"(prefers-color-scheme: dark)\" srcset=\"https://img.shields.io/maintenance/yes/2024\"><source media=\"(prefers-color-scheme: light)\" srcset=\"https://img.shields.io/maintenance/yes/2024\"><img alt=\"Maintenance Status: Active\" src=\"https://img.shields.io/maintenance/yes/2024\"></picture>\n[](https://github.com/SpareCores/sc-crawler/blob/main/LICENSE)\n[](https://pypi.org/project/sparecores-crawler/)\n[](https://www.ngisearch.eu/view/Events/OC3Searchers)\n\nSC Crawler is a Python package to pull and standardize data on cloud\ncompute resources, with tooling to help organize and update the\ncollected data into databases.\n\n## Installation\n\nStable version from PyPI:\n\n```\npip install sparecores-crawler\n```\n\nMost recent version from GitHub:\n\n```\npip install \"sparecores-crawler @ git+https://git@github.com/SpareCores/sc-crawler.git\"\n```\n\n## References\n\n- [Package documentation](https://sparecores.github.io/sc-crawler/)\n- [Database schemas](https://dbdocs.io/spare-cores/sc-crawler)\n- [Latest SQLite database release](https://sc-data-public-40e9d310.s3.amazonaws.com/sc-data-all.db.bz2)\n- [sparecores.com](https://sparecores.com)\n",
"bugtrack_url": null,
"license": null,
"summary": "Pull and standardize data on cloud compute resources.",
"version": "0.3.1",
"project_urls": {
"documentation": "https://sparecores.github.io/sc-crawler/",
"homepage": "https://sparecores.com",
"issues": "https://github.com/SpareCores/sc-crawler/issues",
"repository": "https://github.com/SpareCores/sc-crawler"
},
"split_keywords": [
"cloud",
" compute",
" etl",
" sqlite",
" spot-instances",
" cost-optimization"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "df8c747ddce79207674671f5b23496f0d97828b29c8bc354263a12c279c84235",
"md5": "842005705cfacab083cf8863dc6ce9a8",
"sha256": "c3435bda5993fcbe66d877301061bbfe41e5e89d6eefca034e50de2c65c5fd99"
},
"downloads": -1,
"filename": "sparecores_crawler-0.3.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "842005705cfacab083cf8863dc6ce9a8",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 110360,
"upload_time": "2024-10-25T11:22:49",
"upload_time_iso_8601": "2024-10-25T11:22:49.037792Z",
"url": "https://files.pythonhosted.org/packages/df/8c/747ddce79207674671f5b23496f0d97828b29c8bc354263a12c279c84235/sparecores_crawler-0.3.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "e2d4e7a1c641933d6942a90670aa82bc82dfe4ddd3c2a7a2760c1e59739d1026",
"md5": "8d28d5fa94d63ee23604f0ccf5b1ae62",
"sha256": "6b520eee332b8eac6ee76c116358852790869e12d272c0e50cd4acced3aa6b10"
},
"downloads": -1,
"filename": "sparecores_crawler-0.3.1.tar.gz",
"has_sig": false,
"md5_digest": "8d28d5fa94d63ee23604f0ccf5b1ae62",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 100272,
"upload_time": "2024-10-25T11:22:50",
"upload_time_iso_8601": "2024-10-25T11:22:50.778887Z",
"url": "https://files.pythonhosted.org/packages/e2/d4/e7a1c641933d6942a90670aa82bc82dfe4ddd3c2a7a2760c1e59739d1026/sparecores_crawler-0.3.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-10-25 11:22:50",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "SpareCores",
"github_project": "sc-crawler",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"tox": true,
"lcname": "sparecores-crawler"
}
```