mlflow-slurm

Name: mlflow-slurm
Version: 1.0.3
Home page: https://github.com/ncsa/mlflow-slurm
Summary: Backend implementation for running MLFlow projects on Slurm
Maintainer: Ben Galewsky
Requires Python: >=3.6
Keywords: mlflow
Requirements: mlflow
Upload time: 2024-01-31 17:34:26
# MLFlow-Slurm
Backend for executing MLFlow projects on a Slurm batch system.

## Usage
Install this package in the environment from which you will submit jobs.
If you are submitting jobs from inside other jobs, make sure this package is
listed in your conda or pip environment.
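
For example, a typical install from PyPI looks like:
```shell
pip install mlflow-slurm
```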

Pass `slurm` as the `--backend` when you run your job, and include a JSON
config file to control how the batch script is constructed:
```shell
mlflow run --backend slurm \
          --backend-config slurm_config.json \
          examples/sklearn_elasticnet_wine
```

The backend generates a batch script named after the job ID and submits it via the
Slurm `sbatch` command. It also tags the MLflow run with the Slurm JobID.

## Configure Jobs
You can set values in a JSON file to control job submission. The supported
properties in this file are listed below, with an example configuration after the table:

| Config File Setting | Use                                                                                                                     |
|---------------------|-------------------------------------------------------------------------------------------------------------------------|
| partition           | Which Slurm partition the job should run in                                                                               |
| account             | Account name to run under                                                                                                 |
| gpus_per_node       | Number of GPUs to allocate per node on GPU partitions                                                                     |
| gres                | Slurm generic resource (GRES) requests                                                                                    |
| mem                 | Amount of memory to allocate to CPU jobs                                                                                  |
| modules             | List of modules to load before starting the job                                                                           |
| time                | Maximum CPU time the job may run                                                                                          |
| sbatch-script-file  | Name of the batch file to be produced. Leave blank to have the service generate a script file name based on the run ID   |
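
A minimal `slurm_config.json` might look like the following; the partition, account, module names, and resource values are placeholders you would replace with settings appropriate for your cluster:
```json
{
  "partition": "gpu",
  "account": "my_allocation",
  "gpus_per_node": 1,
  "mem": "16G",
  "modules": ["anaconda3", "cuda"],
  "time": "04:00:00"
}
```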

## Development
The Slurm Docker deployment is handy for testing and development. You can start
up a Slurm environment with the included docker-compose file.
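
Assuming the compose file sits at the repository root, a typical way to bring that environment up is:
```shell
docker compose up -d
```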


            
