happytransformer

Name: happytransformer
Version: 3.0.0
Home page: https://github.com/EricFillion/happy-transformer
Summary: Happy Transformer makes it easy to fine-tune NLP Transformer models and use them for inference.
Upload time: 2023-08-05 22:54:04
Author: The Happy Transformer Development Team
License: Apache 2.0
Keywords: bert, roberta, ai, transformer, happy, happytransformer, classification, nlp, nlu, natural, language, processing, understanding
            [![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0) 
[![Downloads](https://pepy.tech/badge/happytransformer)](https://pepy.tech/project/happytransformer)
[![Website shields.io](https://img.shields.io/website-up-down-green-red/http/shields.io.svg)](http://happytransformer.com)
![PyPI](https://img.shields.io/pypi/v/happytransformer)
[![](https://github.com/EricFillion/happy-transformer/workflows/build/badge.svg)](https://github.com/EricFillion/happy-transformer/actions)

# Happy Transformer 
**Documentation and news: [happytransformer.com](http://happytransformer.com)**



Join our Discord server: [![Support Server](https://img.shields.io/discord/839263772312862740.svg?label=Discord&logo=Discord&colorB=7289da&style=?style=flat-square&logo=appveyor)](https://discord.gg/psVwe3wfTb)



![HappyTransformer](logo.png)

Happy Transformer makes it easy to fine-tune NLP Transformer models and use them for inference. 

## 3.0.0 
1. DeepSpeed support for training 
2. Apple's MPS backend for training and inference 
3. WandB integration to track training runs 
4. Data supplied for training is automatically split into training and evaluation portions
5. Push models directly to Hugging Face's Model Hub

Read about the full 3.0.0 update, including breaking changes, [here](https://happytransformer.com/news/). 
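
As a rough illustration of the training workflow, here is a minimal fine-tuning sketch. The class and argument names (`HappyGeneration`, `GENTrainArgs`, `num_train_epochs`) follow the project's documented pattern, but check the 3.0.0 documentation for the exact options controlling DeepSpeed, WandB, and Hub pushes:

```python
from happytransformer import HappyGeneration, GENTrainArgs

# Load a GPT-2 checkpoint for text generation.
happy_gen = HappyGeneration("GPT2", "gpt2")

# Fine-tune on a plain-text file. In 3.0.0 the supplied data is split
# automatically into training and evaluation portions (item 4 above).
# Argument names here mirror the documented pattern; see the 3.0.0 docs
# for the DeepSpeed, WandB, and Hub-related options.
args = GENTrainArgs(num_train_epochs=1)
happy_gen.train("train.txt", args=args)
```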


## Tasks 
  
| Task                     | Inference | Training   |
|--------------------------|-----------|------------|
| Text Generation          | ✔         | ✔          |
| Text Classification      | ✔         | ✔          | 
| Word Prediction          | ✔         | ✔          |
| Question Answering       | ✔         | ✔          | 
| Text-to-Text             | ✔         | ✔          | 
| Next Sentence Prediction | ✔         |            | 
| Token Classification     | ✔         |            | 
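
Each row in the table has its own Happy class with the same overall shape as the Quick Start example below. As a sketch, text classification inference looks like this (assuming the `HappyTextClassification` class and its `classify_text` method; the checkpoint name is only an illustration):

```python
from happytransformer import HappyTextClassification

# The model type and name are illustrative; any compatible
# Hugging Face checkpoint can be supplied.
happy_tc = HappyTextClassification(
    "DISTILBERT", "distilbert-base-uncased-finetuned-sst-2-english"
)
result = happy_tc.classify_text("Great movie! 5/5")
print(result.label, result.score)  # e.g. POSITIVE 0.99...
```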

## Quick Start
```sh
pip install happytransformer
```

```python
from happytransformer import HappyWordPrediction

happy_wp = HappyWordPrediction()  # defaults to distilbert-base-uncased
result = happy_wp.predict_mask("I think therefore I [MASK]")
print(result)  # [WordPredictionResult(token='am', score=0.10172799974679947)]
print(result[0].token)  # am
```
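
The same call works with other checkpoints and can return several candidates. A sketch, assuming the `(model_type, model_name)` constructor and the `top_k` parameter (verify both against the current API docs):

```python
from happytransformer import HappyWordPrediction

# Load an alternative checkpoint and ask for the top 3 predictions.
happy_wp = HappyWordPrediction("BERT", "bert-base-uncased")
results = happy_wp.predict_mask("I think therefore I [MASK]", top_k=3)
for result in results:
    print(result.token, result.score)
```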

## Maintainers
- [Eric Fillion](https://github.com/ericfillion), Lead Maintainer
- [Ted Brownlow](https://github.com/ted537), Maintainer


## Tutorials 
[Text generation with training (GPT-Neo)](https://youtu.be/GzHJ3NUVtV4)

[Text classification (training)](https://www.vennify.ai/train-text-classification-transformers/) 

[Text classification (hate speech detection)](https://youtu.be/jti2sPQYzeQ) 

[Text classification (sentiment analysis)](https://youtu.be/Ew72EAgM7FM)

[Word prediction with training (DistilBERT, RoBERTa)](https://youtu.be/AWe0PHsPc_M)

[Top T5 Models](https://www.vennify.ai/top-t5-transformer-models/)

[Grammar Correction](https://www.vennify.ai/grammar-correction-python/)

[Fine-tune a Grammar Correction Model](https://www.vennify.ai/fine-tune-grammar-correction/)



            
