runcrawler

- Name: runcrawler
- Version: 0.0.2 (PyPI)
- Summary: AVBP API
- Keywords: avbp, api
- Author email: coop@cerfacs.fr
- Upload time: 2021-05-04 10:07:19
- docs_url: None
- Requirements: no requirements were recorded
- CI: no Travis-CI, no coveralls test coverage
# RUNCRAWLER

**Runcrawler** is a software-usage monitoring tool based on data extraction.

Parallel computing programs such as CFD solvers often run on different clusters, launched by different users, so it is hard for managers to get a big picture of how a program is used. **Runcrawler** monitors this usage and lets managers extract a variety of useful information from the data related to each execution of the program.

Here are some examples of questions that managers often have:

- What are the typical errors repeated by the users of my team?
- Are our runs efficient in terms of CPU time?
- If so, are they related to a certain parameter setting?

As an example, this pie chart shows the percentage of runs that converged versus those that did not, alongside the code associated with the reason each run crashed (0 if it exited cleanly, 300 if not, in this case).


<p align="center"> 
    <img src=https://cerfacs.fr/coop/images/runcrawler/error_log.png width=300>
</p>
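The kind of aggregation behind such a chart can be sketched in a few lines of Python. The run records and codes below are made up for illustration (0 = converged, 300 = crashed, as in the example above); they are not output from runcrawler itself:

```python
from collections import Counter

# Hypothetical exit codes for a batch of runs
# (0 = converged, 300 = crashed, as in the pie-chart example).
runs = [0, 0, 300, 0, 300, 0, 0, 0]

# Count how many runs ended with each code and print percentages.
counts = Counter(runs)
total = len(runs)
for code, n in sorted(counts.items()):
    print(f"code {code}: {100 * n / total:.0f}% of runs")
# code 0: 75% of runs
# code 300: 25% of runs
```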



**Runcrawler** can return the error codes associated with crashes, improving your understanding of why runs fail.


## Installation
**runcrawler** can be installed by running

`pip install runcrawler`

## How to use

Use the CLI 

`runcrawler scan-log path/to-your-file`

This will return a code corresponding to the reason why your run crashed.
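The idea of mapping log contents to an error code could be sketched as follows. This is a minimal, hypothetical illustration, not runcrawler's actual implementation; the patterns are invented, and only the codes 0 and 300 come from the example above:

```python
import re

# Hypothetical mapping from log patterns to error codes
# (0 = clean exit, 300 = crash; patterns invented for illustration).
ERROR_PATTERNS = [
    (re.compile(r"exited cleanly|normal termination", re.I), 0),
    (re.compile(r"segmentation fault|fatal error", re.I), 300),
]

def scan_log(text: str) -> int:
    """Return the first matching error code, or -1 if no pattern matches."""
    for pattern, code in ERROR_PATTERNS:
        if pattern.search(text):
            return code
    return -1

print(scan_log("FATAL ERROR: divergence detected"))  # 300
```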

## To go further

Read more at [cerfacs.fr/coop/runcrawler](https://cerfacs.fr/coop/runcrawler).


Raw data

{
    "_id": null,
    "home_page": "",
    "name": "runcrawler",
    "maintainer": "",
    "docs_url": null,
    "requires_python": "",
    "maintainer_email": "",
    "keywords": "AVBP API",
    "author": "",
    "author_email": "coop@cerfacs.fr",
    "download_url": "https://files.pythonhosted.org/packages/85/5a/ba194fe224bfce53a4e71a64022a9c01a6ee617adcd43080412b04c8f911/runcrawler-0.0.2.tar.gz",
    "platform": "",
    "description": "# RUNCRAWLER\n\n**Runcrawler** is a monitoring tool on software-usage based on data extraction.\n\nParallel computing program such as CFD-solver often run on different clusters, by different users. It is hard for managers to get a big picture on how the program is used. **run crawler** is a tool to monitor their usage and allow managers to get a various type of useful information by extracting data related to the execution of the program.\n\nHere are some examples of questions which managers often have. </br>\n- What are the typical errors repeated by the users of my team ? </br>\n- Are our runs efficient in terms of CPU time ? </br>\n- If so, are they related to a certain parameter setting ? </br>\n\nAs an example, this chart pie shows the percentage of runs that are converged versus the runs\nthat did not converge and on the side, the code associated to the reason it \ncrashed (0 if it exited fine, 300 if not in that case)\n\n\n<p align=\"center\"> \n    <img src=https://cerfacs.fr/coop/images/runcrawler/error_log.png width=300>\n</p>\n\n\n\n**Runcrawler** can return the error codes associated to crashes and improve \nyour understanding.\n\n\n## Installation\n**runcrawler** is available by simple execution of\n\npip install runcrawler\n\n## How to use\n\nUse the CLI \n\n**runcrawler scan-log path/to-your-file**.\n\nThis will return a code corresponding to the reason why your run crashed.\n\n## To go further\n\nRead [further](https://cerfacs.fr/coop/runcrawler)\n\n\n\n\n",
    "bugtrack_url": null,
    "license": "",
    "summary": "AVBP API",
    "version": "0.0.2",
    "split_keywords": [
        "avbp",
        "api"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "md5": "0b6fb145b93c4aa6bb57901dc1d0cd20",
                "sha256": "4a96968538146ad3c8451e2f91975aa5243a607621e79c575ce127f0da377ec7"
            },
            "downloads": -1,
            "filename": "runcrawler-0.0.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "0b6fb145b93c4aa6bb57901dc1d0cd20",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 5277,
            "upload_time": "2021-05-04T10:07:17",
            "upload_time_iso_8601": "2021-05-04T10:07:17.836864Z",
            "url": "https://files.pythonhosted.org/packages/5f/66/12f8a2a57e32c2d477712843192d80531fa0759f8662a370dd92d98a579d/runcrawler-0.0.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "md5": "c2691c9b9e70a87b62b70b417ae90987",
                "sha256": "bc1ebbc498ba8a5055b821da3e85be3e20e3421abed762ffafb1b62639c833cf"
            },
            "downloads": -1,
            "filename": "runcrawler-0.0.2.tar.gz",
            "has_sig": false,
            "md5_digest": "c2691c9b9e70a87b62b70b417ae90987",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 4813,
            "upload_time": "2021-05-04T10:07:19",
            "upload_time_iso_8601": "2021-05-04T10:07:19.249528Z",
            "url": "https://files.pythonhosted.org/packages/85/5a/ba194fe224bfce53a4e71a64022a9c01a6ee617adcd43080412b04c8f911/runcrawler-0.0.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2021-05-04 10:07:19",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "lcname": "runcrawler"
}