tributaries-ml

Name: tributaries-ml
Version: 1.0.5
Home page: https://github.com/agi-init/tributaries
Summary: A library for mass-deploying UnifiedML apps on slurm-enabled servers.
Upload time: 2024-03-07 13:29:16
Author: Sam Lerman
Requires Python: >=3.10.8
Keywords: slurm, job scheduling, DRM, artificial intelligence, machine learning, deep learning, reinforcement learning, image classification
Requirements: No requirements were recorded.

# Tributaries

A library for mass-deploying [UnifiedML](https://www.github.com/agi-init/UnifiedML) apps on [slurm](https://en.wikipedia.org/wiki/Slurm_Workload_Manager)-enabled remote servers.

```console
pip install tributaries-ml
```

[Examples](Examples)

### Server

Simply create and run a Python file with a server configuration like this one:

```python
# MyServer.py

from tributaries import my_server


@my_server(sweep='path/to/my/sweep.py')
def main():
    ...
    return server, username, password, func, app_name_paths, commands, sbatch


if __name__ == '__main__':
    main()
```

The decorated function must return the ```server```, ```username```, and ```password```.

Optionally:
- Any additional ```func``` that needs to be run (e.g. [connecting to a VPN](VPN.py)).
- An ```app_name_paths``` dictionary of names and paths to any UnifiedML apps' run scripts you'd like to use, *e.g.* ```{'name_of_my_app': 'path/to/name_of_my_app/Run.py'}```, or leave this blank to use the remote server's root home directory and ```ML``` as the run script.
- A ```commands``` list or string of any extra environment-setup commands you may need to pass to the remote server's command line and deploy config, such as [activating a conda environment](Examples/Servers/XuLab.py#L10).
- Any additional ```sbatch``` string text you'd like to add to the deploy config.

[You may use one of the blueprint server files provided.](Examples/Servers)
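To make the return values concrete, here is a minimal filled-in sketch. The server address, credentials, paths, conda environment, and ```sbatch``` text below are all placeholders rather than real values, and the optional returns simply mirror the list above:

```python
# MyServer.py — a filled-in sketch; every value below is a placeholder

from tributaries import my_server


@my_server(sweep='path/to/my/sweep.py')
def main():
    server = 'slurm.example.edu'  # address of the slurm-enabled remote server
    username = 'my_username'
    password = 'my_password'
    func = None  # optional extra callable to run first, e.g. connecting to a VPN
    app_name_paths = {'name_of_my_app': 'path/to/name_of_my_app/Run.py'}
    commands = ['conda activate my_env']  # extra environment-setup commands
    sbatch = '#SBATCH --gres=gpu:1'  # extra text added to the deploy config
    return server, username, password, func, app_name_paths, commands, sbatch


if __name__ == '__main__':
    main()
```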

### Sweep

Note that the ```my_server``` decorator accepts a ```sweep=``` file path.

You may define a ```sweep``` file like this one:

```python
# path/to/my/sweep.py

from tributaries import my_sweep, my_plots, my_checkpoints

my_sweep.hyperparams = [
    # Hyperparam set 1
    '... experiment=Exp1',

    # Hyperparam set 2
    '... experiment=Exp2'
]

my_sweep.app = 'name_of_my_app'  # Corresponds to an app name in 'app_name_paths' of Server definition

# Logs to download
my_plots.plots = [['Exp1', 'Exp2']]  # Names of experiments to plot together in a single plot

my_checkpoints.experiments = ['Exp1', 'Exp2']  # Names of experiments to download checkpoints for
```

The ```my_sweep``` and ```my_plots``` objects have [additional configuration options](Sweeps.py) that can be used to further customize launching and plotting.

[See here for examples.](Examples/Sweeps) 

### Running

That's it. Running ```python MyServer.py``` will launch the corresponding sweep experiments on your remote server. Add the ```plot=true``` flag to instead download plots to your local machine.

Add ```checkpoints=true``` to download checkpoints.

#### Launching

```console
python MyServer.py
```

#### Plotting & Logs

```console
python MyServer.py plot=true
```

#### Checkpoints

```console
python MyServer.py checkpoints=true
```

[//]: # (Note: these hyperparams are already fully part of [UnifiedML](github.com/agi-init/UnifiedML), together with the ```my_server=``` server-path flag for pointing to a server file, *e.g.*, ```ML my_server=MyServer.main``` can launch and plot the above directly from [UnifiedML](github.com/agi-init/UnifiedML)! )

### Extra

Note: Tributaries launching works fully for non-UnifiedML apps too. Also, for convenience, ```tributaries hyperparams='...' app='run.py'``` can be used as a general slurm launcher on your remote servers.
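For example, a standalone launch might look like the following sketch, where the hyperparameter string and script path are placeholders:

```console
tributaries hyperparams='experiment=Exp1' app='run.py'
```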

One more thing: if your remote UnifiedML apps are [git-ssh enabled](https://docs.github.com/en/authentication/connecting-to-github-with-ssh), Tributaries will automatically try syncing with the latest branch via a git pull. You can disable automatic GitHub-syncing with the ```github=false``` flag.
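For instance, following the same flag style as the commands above:

```console
python MyServer.py github=false
```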

#

[Licensed under the MIT license.](MIT_LICENSE)

<img width="10%" alt="tributaries-logo" src="https://github.com/AGI-init/Assets/assets/92597756/7e7bb054-f265-4f53-a4f2-d3af52f1d890">

            
