# GROUPED SAMPLING LIBRARY
This is a library for generating text with causal language models using the grouped sampling algorithm.

It supports models from the Hugging Face Hub that follow the causal language model architecture.
## Installation
```
pip install grouped-sampling
```
## Usage
1. Make sure you have Python 3 (3.9+ is recommended), a fast internet connection, and the grouped sampling library installed.
2. Choose a causal language model from the Hugging Face Hub.
3. Choose a group size, which should be an upper limit on the length of the generated texts.
A group size that is too high causes unnecessary computation, while one that is too low hurts performance in both runtime and text quality.
4. Create the pipeline: `pipe = GroupedSamplingPipeLine(model_name=YOUR_MODEL_NAME, group_size=YOUR_GROUP_SIZE)`
5. Generate text: `answer = pipe(YOUR_TEXT)["generated_text"]` (see the full example below)
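
A minimal end-to-end sketch of the steps above, assuming `GroupedSamplingPipeLine` can be imported from the top-level `grouped_sampling` package; the model name and group size are placeholders you should replace with your own choices:

```python
from grouped_sampling import GroupedSamplingPipeLine

# Any causal language model from the Hugging Face Hub should work;
# "gpt2" is used here purely as an illustrative example.
pipe = GroupedSamplingPipeLine(
    model_name="gpt2",
    group_size=128,  # upper limit on the length of the generated text
)

# The pipeline returns a dict; the generated text is under "generated_text".
answer = pipe("The grouped sampling algorithm works by")["generated_text"]
print(answer)
```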