# Spark Streaming Metrics Prometheus Instrumentation
This project provides seamless integration between PySpark and Prometheus for monitoring Spark **Structured Streaming** applications.
> *Note:* This project focuses specifically on better metrics for Spark Structured Streaming. If you would like other Spark metrics in Prometheus, such as executor memory, CPU, or GC times, please refer to Spark's monitoring guide and its Prometheus support via JMX (Java Management Extensions).
## Features
- Collects metrics from PySpark Streaming Queries
- Exposes metrics in Prometheus format
- Easy integration with existing PySpark applications
## Installation
To install the required dependencies, run:
```bash
pip install -r requirements.txt
```
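If you are not working from a checkout of this repository, the package is also published on PyPI as `pyspark-prometheus` and can be installed directly:

```bash
pip install pyspark-prometheus
```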
## Usage
1. Import the necessary modules in your PySpark application:
```python
from pyspark.sql import SparkSession
from pyspark_prometheus import with_prometheus_metrics
```
2. Initialize the Prometheus metrics:
```python
spark = SparkSession.builder.master("local").appName("MySparkApp").getOrCreate()
spark = with_prometheus_metrics(spark, 'http://localhost:9091')
```
3. Start your PySpark job as usual. Metrics will be collected and exposed automatically; a self-contained sketch of what this can look like is shown below.
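Port 9091 in the URL above is the default port of the Prometheus Pushgateway, so the endpoint passed to `with_prometheus_metrics` presumably points at one. The sketch below assumes that endpoint is reachable; the `rate` source and console sink are placeholders chosen only so the example runs without external data:

```python
from pyspark.sql import SparkSession
from pyspark_prometheus import with_prometheus_metrics

# Build a local session and attach the Prometheus instrumentation to it.
spark = SparkSession.builder.master("local[*]").appName("MySparkApp").getOrCreate()
spark = with_prometheus_metrics(spark, "http://localhost:9091")

# Any streaming query works; the built-in `rate` source is used here only
# to generate rows without an external system.
stream = (
    spark.readStream.format("rate")
    .option("rowsPerSecond", 10)
    .load()
)

query = (
    stream.writeStream.format("console")
    .outputMode("append")
    .start()
)

query.awaitTermination(60)  # let the query run for a minute
query.stop()
spark.stop()
```

With the instrumented session in place, the streaming query's progress should be reported to the configured endpoint as it runs, where Prometheus can scrape it.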
## Contributing
Contributions are welcome! Please submit a pull request or open an issue to discuss your ideas.
## License
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
## Contact
For any questions or support, please open an issue in the repository.