stop-sequencer

Name: stop-sequencer
Version: 1.2.3
Home page: https://github.com/hyunwoongko/stop-sequencer
Summary: Implementation of stop sequencer for Huggingface Transformers
Upload time: 2023-06-06 13:35:53
Author: Hyunwoong Ko
Requires Python: >=3
Requirements: none recorded
# Stop Sequencer
- Implementation of a stop sequencer for Huggingface Transformers.
- Note: post-processing must be applied to the output as well, because of a limitation of the Transformers implementation.
<br><br>
  
## 1. Installation
```console
pip install stop-sequencer
```
<br>

## 2. Usage
### 2.1. Generation without StopSequencer
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokens = tokenizer(
    "Kevin: Hello "
    "Ryan: Hi "
    "Kevin: What are you doing? "
    "Ryan: I am watching TV. you? "
    "Kevin: ",
    return_tensors="pt",
)["input_ids"]

outputs = model.generate(
    tokens,
    num_beams=5,
    no_repeat_ngram_size=4,
    repetition_penalty=1.5,
    max_length=100,
)

outputs = tokenizer.batch_decode(outputs[:, tokens.size(-1):], skip_special_tokens=True)[0]
print(outputs)
```
```
ive been watching TV for a long time. Ryan: I have been watching TV since I was 12 years old. Kevin: So what do you want me to do? Ryan: Well, I want you to watch TV. You know what I mean? I'm going to be watching TV. I'm not going to sit down and watch TV. I don't want to
```
<br><br>

### 2.2. Generation with StopSequencer

```python
from stop_sequencer import StopSequencer

stop_texts = ["Ryan:", "Kevin:"]

stop_sequencer = StopSequencer(
    model,
    model_type="causal",  # or "seq2seq"
    tokenizer=tokenizer,
)

model = stop_sequencer.register_stop_texts(
    stop_texts=stop_texts,
    input_length=tokens.size(-1),
)

outputs = model.generate(
    tokens,
    num_beams=5,
    no_repeat_ngram_size=4,
    repetition_penalty=1.5,
    max_length=100,
)

outputs = tokenizer.batch_decode(outputs[:, tokens.size(-1):], skip_special_tokens=True)[0]
print(outputs)
```
```
ive been watching TV for a long time. Ryan: I have
```
You can see that `Ryan: I have` appears in the output before generation stops. Because of a limitation of Huggingface Transformers, generation can only be terminated *after* the stop text (`Ryan:`) has already been generated.
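The limitation can be illustrated with a plain-Python sketch. A stop check of this kind runs only after each token has been produced, so by the time the check fires, the stop text is already in the output. The function name `should_stop` is illustrative and is not part of the library's API:

```python
def should_stop(generated_text: str, stop_texts: list[str]) -> bool:
    """Return True once any stop text appears in the generated suffix."""
    return any(s in generated_text for s in stop_texts)

# The check fires only after "Ryan:" has already been generated, which is
# why the stop text (plus a few trailing tokens) remains in the output.
print(should_stop("ive been watching TV for a long time. Ryan: I have",
                  ["Ryan:", "Kevin:"]))
```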
<br><br>

### 2.3. Generation with StopSequencer + post-processing
Therefore, post-processing must be performed to completely exclude the stop texts from the generated text.
```python
# cut the output at the first occurrence of each stop text
for s in stop_texts:
    outputs = outputs.split(s)[0].strip()
    
print(outputs)
```
```
ive been watching TV for a long time.
```
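The loop above can be generalized into a small reusable helper that cuts at the earliest occurrence of any stop text. This helper is not part of stop-sequencer; `truncate_at_stop_texts` is a name chosen here for illustration:

```python
def truncate_at_stop_texts(text: str, stop_texts: list[str]) -> str:
    """Cut text at the earliest occurrence of any stop text, then strip whitespace."""
    cut = len(text)
    for s in stop_texts:
        idx = text.find(s)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut].strip()

print(truncate_at_stop_texts(
    "ive been watching TV for a long time. Ryan: I have",
    ["Ryan:", "Kevin:"],
))
# ive been watching TV for a long time.
```

Cutting at the *earliest* match (rather than splitting on each stop text in turn) gives the same result here, but behaves more predictably when multiple stop texts appear in the output.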
<br><br>

## License
```
Copyright 2021 Hyunwoong Ko.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```
