Name | logits-processor-zoo |
Version | 0.1.0 |
home_page | None |
Summary | A collection of LogitsProcessors to customize and enhance LLM behavior for specific tasks. |
upload_time | 2024-11-17 10:23:54 |
maintainer | None |
docs_url | None |
author | Ahmet Erdem |
requires_python | <4.0,>=3.10 |
license | None |
keywords | None |
VCS | None |
bugtrack_url | None |
requirements | No requirements were recorded. |
<p align="center">
<img src="docs/logo.jpg" width="50%">
</p>
# logits-processor-zoo
Struggling to get LLMs to follow your instructions? LogitsProcessorZoo offers a zoo of tools to use LLMs for specific tasks, beyond just grammar enforcement!
## Supported Frameworks
* transformers
* vLLM
* TensorRT-LLM
## Usage
```python
import vllm
from logits_processor_zoo.vllm import GenLengthLogitsProcessor, CiteFromPromptLogitsProcessor, ForceLastPhraseLogitsProcessor

# Placeholders: substitute your own model identifier and prompts.
model_name = "meta-llama/Llama-3.2-1B-Instruct"
prompts = ["Which phone fits a heavy caller best? Explain briefly."]

model = vllm.LLM(
    model_name,
    trust_remote_code=True,
    dtype="half",
    enforce_eager=True
)
tokenizer = model.get_tokenizer()

logits_processors = [
    CiteFromPromptLogitsProcessor(tokenizer, boost_factor=2.0),
    GenLengthLogitsProcessor(tokenizer, boost_factor=-0.2, p=1),
    ForceLastPhraseLogitsProcessor("\n\nReferences:\n", tokenizer)
]

gen_output = model.generate(
    prompts,
    vllm.SamplingParams(
        n=1,
        temperature=0,
        seed=0,
        skip_special_tokens=True,
        max_tokens=64,
        logits_processors=logits_processors
    ),
    use_tqdm=False
)
```
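
The processors listed under Supported Frameworks can also be used with Hugging Face transformers. Below is a minimal sketch of how that could look; the `logits_processor_zoo.transformers` import path and the constructor arguments are assumptions that mirror the vLLM example, so consult the example notebooks for the exact API.

```python
# Hedged sketch: assumes the zoo exposes transformers-compatible processors under
# logits_processor_zoo.transformers with the same constructor arguments as the vLLM variants.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, LogitsProcessorList
from logits_processor_zoo.transformers import GenLengthLogitsProcessor  # assumed import path

model_name = "meta-llama/Llama-3.2-1B-Instruct"  # example model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")

inputs = tokenizer("Summarize what a logits processor does.", return_tensors="pt").to(model.device)

# transformers accepts custom processors through generate()'s logits_processor argument.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    logits_processor=LogitsProcessorList([
        GenLengthLogitsProcessor(tokenizer, boost_factor=-0.2, p=1)  # assumed signature
    ]),
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```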
For detailed examples in each framework, please have a look at the **example_notebook** directory.
## Available Logits Processors
### GenLengthLogitsProcessor
A logits processor that adjusts the likelihood of the end-of-sequence (EOS) token based on the length of the generated sequence, encouraging or discouraging shorter answers.
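
For intuition, the effect can be sketched as a processor that adds a length-dependent bonus (or penalty) to the EOS logit at every decoding step. The snippet below is an illustrative re-implementation, not the library's code, and the exact scaling formula is an assumption.

```python
# Illustrative sketch only; the real GenLengthLogitsProcessor may scale differently.
import torch

class LengthAwareEOSBoost:
    """Boost (or suppress) the EOS logit as the sequence grows longer."""

    def __init__(self, eos_token_id: int, boost_factor: float, p: int = 1):
        self.eos_token_id = eos_token_id
        self.boost_factor = boost_factor
        self.p = p

    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor) -> torch.FloatTensor:
        # Simplification: treat the full sequence length so far as the generated length.
        gen_len = input_ids.shape[-1]
        scores[:, self.eos_token_id] = scores[:, self.eos_token_id] + self.boost_factor * (gen_len ** self.p)
        return scores
```

A positive boost factor makes the EOS token increasingly likely as generation proceeds (shorter answers), while a negative one, as in the usage example above, pushes the model toward longer answers.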
### CiteFromPromptLogitsProcessor
A logits processor which boosts or diminishes the likelihood of tokens present in the prompt (and optionally the EOS token), encouraging the model to generate tokens similar to those seen in the prompt, or to avoid them.
### ForceLastPhraseLogitsProcessor
A logits processor which forces LLMs to include a given phrase before they finalize their answers. Common use cases include providing references or thanking the user, depending on the context.
### MultipleChoiceLogitsProcessor
A logits processor that answers multiple-choice questions with one of the given choices. A multiple-choice question looks like:
```
I am getting a lot of calls during the day. What is more important for me to consider when I buy a new phone?
0. Camera
1. Screen resolution
2. Operating System
3. Battery
```
The goal is to make the LLM generate "3" as the answer.
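
As a rough sketch, the processor could be combined with the vLLM setup from the Usage section like this; the constructor arguments shown for `MultipleChoiceLogitsProcessor` (a tokenizer plus the choice labels) are assumptions, so check the example notebooks for the real signature.

```python
# Hedged sketch; reuses `vllm`, `model` and `tokenizer` from the Usage example above.
from logits_processor_zoo.vllm import MultipleChoiceLogitsProcessor

question = (
    "I am getting a lot of calls during the day. "
    "What is more important for me to consider when I buy a new phone?\n"
    "0. Camera\n"
    "1. Screen resolution\n"
    "2. Operating System\n"
    "3. Battery\n"
    "Answer:"
)

# Constrain generation to one of the listed choice labels ("0"-"3"); the signature is assumed.
mc_processor = MultipleChoiceLogitsProcessor(tokenizer, choices=["0", "1", "2", "3"])

output = model.generate(
    [question],
    vllm.SamplingParams(temperature=0, max_tokens=1, logits_processors=[mc_processor]),
    use_tqdm=False,
)
print(output[0].outputs[0].text)  # ideally "3"
```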
Raw data
{
    "_id": null,
    "home_page": null,
    "name": "logits-processor-zoo",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.10",
    "maintainer_email": null,
    "keywords": null,
    "author": "Ahmet Erdem",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/73/19/2037379dda8e2e1cd75255cf30d68f8b7ab09264d9ef032b605ecc44156a/logits_processor_zoo-0.1.0.tar.gz",
    "platform": null,
"description": "<p align=\"center\">\n <img src=\"docs/logo.jpg\" width=\"50%\">\n</p>\n\n# logits-processor-zoo\n\nStruggling to get LLMs to follow your instructions? LogitsProcessorZoo offers a zoo of tools to use LLMs for specific tasks, beyond just grammar enforcement!\n\n## Supported Frameworks\n* transformers\n* vLLM\n* TensorRT-LLM\n\n## Usage\n\n```python\nimport vllm\nfrom logits_processor_zoo.vllm import GenLengthLogitsProcessor, CiteFromPromptLogitsProcessor, ForceLastPhraseLogitsProcessor\n\nmodel = vllm.LLM(\n model_name,\n trust_remote_code=True,\n dtype=\"half\",\n enforce_eager=True\n )\ntokenizer = model.get_tokenizer()\n \nlogits_processors = [\n CiteFromPromptLogitsProcessor(tokenizer, boost_factor=2.0),\n GenLengthLogitsProcessor(tokenizer, boost_factor=-0.2, p=1),\n ForceLastPhraseLogitsProcessor(\"\\n\\nReferences:\\n\", tokenizer)\n]\n\n\ngen_output = model.generate(\n prompts,\n vllm.SamplingParams(\n n=1,\n temperature=0,\n seed=0,\n skip_special_tokens=True,\n max_tokens=64,\n logits_processors=logits_processors\n ),\n use_tqdm=False\n )\n```\n\n\nFor the detailed examples in each framework, please have a look at **example_notebook** directory.\n\n## Available Logits Processors\n\n### GenLengthLogitsProcessor\nA logits processor that adjusts the likelihood of the end-of-sequence (EOS) token based on the length of the generated sequence, encouraging or discouraging shorter answers.\n\n### CiteFromPromptLogitsProcessor\nA logits processor which boosts or diminishes the likelihood of tokens present in the prompt (and optionally EOS token) to encourage the model to generate tokens similar to those seen in the prompt or vice versa.\n\n### ForceLastPhraseLogitsProcessor\nA logits processor which forces LLMs to use the given phrase before they finalize their answers. Most common use cases can be providing references, thanking user with context etc.\n\n### MultipleChoiceLogitsProcessor\nA logits processor to answer multiple choice questions with one of the choices. A multiple choice question is like:\n```\nI am getting a lot of calls during the day. What is more important for me to consider when I buy a new phone?\n0. Camera\n1. Screen resolution\n2. Operating System\n3. Battery\n```\nThe goal is to make LLM generate \"3\" as an answer.",
"bugtrack_url": null,
"license": null,
"summary": "A collection of LogitsProcessors to customize and enhance LLM behavior for specific tasks.",
"version": "0.1.0",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "d5175cddf7cbd856b6386fdcf6193d5f5f38a9eb7ad6c1c4d5a69c06d22a2b63",
"md5": "11a6815093b8eae6aefef0e7e22117b3",
"sha256": "ec8eb48ae79e819168cfc97f413d26bd42ccec39394e835c58fff5065adf6b20"
},
"downloads": -1,
"filename": "logits_processor_zoo-0.1.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "11a6815093b8eae6aefef0e7e22117b3",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<4.0,>=3.10",
"size": 26049,
"upload_time": "2024-11-17T10:23:52",
"upload_time_iso_8601": "2024-11-17T10:23:52.537945Z",
"url": "https://files.pythonhosted.org/packages/d5/17/5cddf7cbd856b6386fdcf6193d5f5f38a9eb7ad6c1c4d5a69c06d22a2b63/logits_processor_zoo-0.1.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "73192037379dda8e2e1cd75255cf30d68f8b7ab09264d9ef032b605ecc44156a",
"md5": "18d87fc5ace121eb1a1cf8a4e6dfb84f",
"sha256": "ab4af322fbb45dc80554bb1a1d6945562a2c386b011ef66f2a4c7ccb3f07c9c3"
},
"downloads": -1,
"filename": "logits_processor_zoo-0.1.0.tar.gz",
"has_sig": false,
"md5_digest": "18d87fc5ace121eb1a1cf8a4e6dfb84f",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<4.0,>=3.10",
"size": 10191,
"upload_time": "2024-11-17T10:23:54",
"upload_time_iso_8601": "2024-11-17T10:23:54.598963Z",
"url": "https://files.pythonhosted.org/packages/73/19/2037379dda8e2e1cd75255cf30d68f8b7ab09264d9ef032b605ecc44156a/logits_processor_zoo-0.1.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-11-17 10:23:54",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "logits-processor-zoo"
}