tune-the-model

Name: tune-the-model
Version: 0.1.33
Summary: beyondml
Home page: https://github.com/tune-the-model/tune-the-model-py
Author: BeyondML
License: MIT
Keywords: beyondml, tune-the-model, gpt-3, nlp
Upload time: 2023-03-30 12:58:21
Requirements: No requirements were recorded.

# Tune The Model Python wrapper

[Tune The Model](https://tunethemodel.com) is a few-shot AutoML system.

Tune The Model handles a wide range of tasks that require understanding or generating natural language. It can solve tasks in 12 languages: English, Spanish, Portuguese, Russian, Turkish, French, German, Italian, Arabic, Polish, Dutch, and Hebrew.

This package provides a simple Python wrapper for the Tune The Model API.

The `tune-the-model` package lets you train models and run inference with them.

## Documentation

You can find the documentation at our [Tune The Model API docs site](https://tune-the-model.github.io/tune-the-model-docs/index.html).

## Just try

We have fine-tuned several models. You can use the [notebook](https://colab.research.google.com/github/beyondml/model-one-py/blob/main/playbook.ipynb) to try them out, and you can [get a token](https://tunethemodel.com) to fine-tune your own model.

## Getting started

First, fill out the [form](https://tunethemodel.com) to get an API key. We will send it to you within a day.

### Installation

To install the package, run `pip install -U tune-the-model`.

### Usage

```py
import tune_the_model as ttm
import pandas as pd

ttm.set_api_key('YOUR_API_KEY')
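# Alternatively, you could read the key from an environment variable instead of
# hard-coding it (the variable name here is just an example, not part of the API):
#   import os
#   ttm.set_api_key(os.environ['TTM_API_KEY'])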

# Load the train and validation datasets; both CSVs are expected to have 'inputs' and 'outputs' columns.
tdf = pd.read_csv('train.csv')
vdf = pd.read_csv('test.csv')

# One call does everything for you: it creates a model, saves it to the given file,
# uploads the datasets, and queues the model for training.
model = ttm.tune_generator(
    'filename.json',
    tdf['inputs'], tdf['outputs'],
    vdf['inputs'], vdf['outputs']
)

# Wait for training to finish; this can take a few hours while our GPUs train your model.
model.wait_for_training_finish()

print(model.status)
print(model.is_ready)

# inference!
the_answer = model.generate('The Answer to the Ultimate Question of Life, the Universe, and Everything')
print(the_answer)
```
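
The service also supports classification. The sketch below is not taken from this README; it assumes the package exposes a `tune_classifier` helper with the same argument layout as `tune_generator` and that the trained model has a `classify` method, so treat these names as assumptions and check the [API docs](https://tune-the-model.github.io/tune-the-model-docs/index.html) for the exact signatures.

```py
import tune_the_model as ttm
import pandas as pd

ttm.set_api_key('YOUR_API_KEY')

# Hypothetical sentiment dataset: 'inputs' holds texts, 'outputs' holds class labels.
tdf = pd.read_csv('train_sentiment.csv')
vdf = pd.read_csv('test_sentiment.csv')

# Assumption: tune_classifier mirrors tune_generator (model file, train inputs/outputs,
# validation inputs/outputs) and queues a classification model for training.
classifier = ttm.tune_classifier(
    'sentiment.json',
    tdf['inputs'], tdf['outputs'],
    vdf['inputs'], vdf['outputs']
)

classifier.wait_for_training_finish()

# Assumption: classify() returns the model's scores for the given text.
print(classifier.classify('I absolutely loved this movie!'))
```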

            

Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/tune-the-model/tune-the-model-py",
    "name": "tune-the-model",
    "maintainer": "",
    "docs_url": null,
    "requires_python": "",
    "maintainer_email": "",
    "keywords": "beyondml,tune-the-model,gpt-3,nlp",
    "author": "BeyondML",
    "author_email": "pavel.gavrilov@beyond.ml",
    "download_url": "https://files.pythonhosted.org/packages/5f/ba/52259d21fe2ee021ca557101ff267184ab49dc8a81bd3383c798948f4263/tune_the_model-0.1.33.tar.gz",
    "platform": null,
    "description": "# Tune The Model python wrapper\n\n[Tune The Model](https://tunethemodel.com) is a few-shot AutoML system.\n\nTune The Model can do almost anything that requires understanding or generating natural language. It is able to solve tasks in 12 languages: English, Spanish, Portuguese, Russian, Turkish, French, German, Italian, Arabic, Polish, Dutch, and Hebrew.\n\nThis package provides a simple wrapper for using our api.\n\nUsing `tune-the-model` package allows you to train and apply models.\n\n## Documentation\n\nYou can find the documentation at our [Tune The Model API docs site](https://tune-the-model.github.io/tune-the-model-docs/index.html).\n\n## Just try\n\nWe have fine-tuned several models. You can use the [notebook](https://colab.research.google.com/github/beyondml/model-one-py/blob/main/playbook.ipynb) to try them out. You can [get the token](https://tunethemodel.com) to fine tune your own model.\n\n## Getting started\n\nFirstly fill out the [form](https://tunethemodel.com) to get a key to access the API. We will send you the key within a day.\n\n### Installation\n\nTo install the package just use `pip install -U tune-the-model`.\n\n### Usage\n\n```py\nimport tune_the_model as ttm\nimport pandas as pd\n\nttm.set_api_key('YOUR_API_KEY')\n\n# load datasets\ntdf = pd.read_csv('train.csv')\nvdf = pd.read_csv('test.csv')\n\n# Call one method. It will do everything for you:\n# create a model, save it to the file, upload datasets and put the model in the queue for training.\nmodel = ttm.tune_generator(\n    'filename.json',\n    tdf['inputs'], tdf['outputs'],\n    vdf['inputs'], vdf['outputs']\n)\n\n# wait...\n# a few hours\n# while our GPUs train your model\nmodel.wait_for_training_finish()\n\nprint(model.status)\nprint(model.is_ready)\n\n# inference!\nthe_answer = model.generate('The Answer to the Ultimate Question of Life, the Universe, and Everything')\nprint(the_answer)\n```\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "beyondml",
    "version": "0.1.33",
    "split_keywords": [
        "beyondml",
        "tune-the-model",
        "gpt-3",
        "nlp"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "81397b985ef5f75544359290768cdfd7b8df07d4f534844a516eb1512d128b5a",
                "md5": "cef60c6990d1241ea211ee96c67f9793",
                "sha256": "4894ac4d97d77d10f7e30e11cb246205c980b54aea9ea79c5e451c2214b883cb"
            },
            "downloads": -1,
            "filename": "tune_the_model-0.1.33-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "cef60c6990d1241ea211ee96c67f9793",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 15437,
            "upload_time": "2023-03-30T12:58:18",
            "upload_time_iso_8601": "2023-03-30T12:58:18.100371Z",
            "url": "https://files.pythonhosted.org/packages/81/39/7b985ef5f75544359290768cdfd7b8df07d4f534844a516eb1512d128b5a/tune_the_model-0.1.33-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "5fba52259d21fe2ee021ca557101ff267184ab49dc8a81bd3383c798948f4263",
                "md5": "67778bf510ad477f51c574c4f4f5d2c4",
                "sha256": "cce80eeb6f998c686ecb7f7e9c12d333dbcbc075052ad5339af29cd22d9de805"
            },
            "downloads": -1,
            "filename": "tune_the_model-0.1.33.tar.gz",
            "has_sig": false,
            "md5_digest": "67778bf510ad477f51c574c4f4f5d2c4",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 14814,
            "upload_time": "2023-03-30T12:58:21",
            "upload_time_iso_8601": "2023-03-30T12:58:21.086443Z",
            "url": "https://files.pythonhosted.org/packages/5f/ba/52259d21fe2ee021ca557101ff267184ab49dc8a81bd3383c798948f4263/tune_the_model-0.1.33.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-03-30 12:58:21",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "github_user": "tune-the-model",
    "github_project": "tune-the-model-py",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [],
    "lcname": "tune-the-model"
}
```