instruction-ner

Name: instruction-ner
Version: 0.1.6
Home page: https://github.com/ovbystrova/InstructionNER
Summary: Unofficial implementation of InstructionNER
Upload time: 2023-05-07 16:31:09
Author: Olga Bystrova
Keywords: python, nlp, deep learning, ner, t5
Requirements: no requirements were recorded
Travis-CI: no Travis
Coveralls test coverage: no coveralls
# InstructionNER: A Multi-Task Instruction-Based Generative Framework for Few-shot NER

[![python 3.8](https://img.shields.io/badge/python-3.8-blue.svg)](https://github.com/ovbystrova/InstructionNER#requirements)
[![license](https://img.shields.io/github/license/ovbystrova/InstructionNER?color=blue)](https://github.com/ovbystrova/InstructionNER/blob/main/LICENSE)
[![pypi version](https://img.shields.io/pypi/v/instruction_ner)](https://pypi.org/project/instruction_ner)
[![pypi downloads](https://img.shields.io/pypi/dm/instruction_ner)](https://pypi.org/project/instruction_ner)


[![tests](https://github.com/ovbystrova/InstructionNER/actions/workflows/tests.yml/badge.svg)](https://github.com/ovbystrova/InstructionNER/actions/workflows/tests.yml)
[![codecov](https://codecov.io/gh/ovbystrova/InstructionNER/branch/main/graph/badge.svg?token=L2OOZKLPJL)](https://codecov.io/gh/ovbystrova/InstructionNER)

Unofficial implementation of [InstructionNER](https://arxiv.org/pdf/2203.03903v1.pdf).

![Screenshot](resources/overall_intro.jpg)

## Requirements
Python >=3.8

## Installation
```shell
pip install instruction-ner
```

Alternatively, install from the requirements files:
```shell
pip install -r requirements/requirements.in # for training purposes
pip install -r requirements/requirements_test.in # for tests
pip install -r requirements/requirements_dev.in # for inference only
```

## Data Preparation
To provide a unified training interface, convert your raw input data
(supported dataset formats: **conll**, **spacy**, **mit**) with the following script:
```shell
instruction_ner-prepare-data \
--path_to_file 'data/conll2003/train.txt' \
--dataset_type 'conll2003' \
--output_folder 'data/conll2003'
```

The script converts each dataset into a list of sentences.
Each sentence is represented as follows:
```json
{
    "context": "SOCCER - JAPAN GET LUCKY WIN , CHINA IN SURPRISE DEFEAT .",
    "entity_values": {
            "LOC": [
                "JAPAN"
            ],
            "PER": [
                "CHINA"
            ]
        },
    "entity_spans": [
            {
                "start": 9,
                "end": 14,
                "label": "LOC"
            },
            {
                "start": 31,
                "end": 36,
                "label": "PER"
            }
        ]
}
```
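
To sanity-check the converted data, the spans can be cross-checked against `context`. The snippet below is a minimal sketch; the output file name (`train.json`) and the top-level list layout are assumptions based on the example above.

```python
import json

# Hypothetical path: prepare-data writes into --output_folder,
# but the exact file name it produces may differ.
with open("data/conll2003/train.json", encoding="utf-8") as f:
    sentences = json.load(f)  # assumed: a list of sentence dicts as shown above

for sentence in sentences[:5]:
    context = sentence["context"]
    for span in sentence["entity_spans"]:
        surface = context[span["start"]:span["end"]]
        # Every span should point at one of the listed entity values.
        assert surface in sentence["entity_values"][span["label"]]
        print(span["label"], "->", surface)
```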

## Training
Script for training the T5 model:
```shell
instruction_ner-train \
--path_to_instructions 'instructions.json' \
--path_to_options 'options.json' \
--log_dir 'runs/test_run' \
--eval_every_n_batches 200 \
--pred_every_n_batches 200 \
--path_to_model_config 'config.yaml' \
--path_to_model_save 'runs/model/'
```

Arguments:
- **--path_to_instructions** - file with instruction prompts
- **--path_to_options** - file mapping each dataset to its entity types (see the sketch after this list)
- **--log_dir** - directory for TensorBoard logs
- **--eval_every_n_batches** - run evaluation every n batches
- **--pred_every_n_batches** - log sample predictions every n batches
- **--path_to_model_config** - path to the model configuration with all information needed to build the model
- **--path_to_model_save** - directory where the trained model is saved
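
For orientation, the two JSON files might look roughly like the sketch below. This is illustrative only: the exact keys and layout expected by `instruction_ner-train` are defined by the repository, so treat them as assumptions.

```python
import json

# Illustrative contents only; the real schema may differ.
options = {"conll2003": ["LOC", "PER", "ORG", "MISC"]}  # dataset -> entity types
instructions = {                                        # task -> prompt text
    "NER": "please extract entities and their types from the input sentence, "
           "all entity types are in options"
}

with open("options.json", "w", encoding="utf-8") as f:
    json.dump(options, f, indent=4)
with open("instructions.json", "w", encoding="utf-8") as f:
    json.dump(instructions, f, indent=4)
```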

## Evaluation
Script for evaluating a trained model:
```shell
instruction_ner-evaluate \
--model_path_or_name 'olgaduchovny/t5-base-qa-ner-conll' \
--path_to_model_config 'config.yaml' \
--path_to_instructions 'instructions.json' \
--path_to_options 'options.json'
```

Arguments:
- **--model_path_or_name** - path to a trained model or a Hugging Face model name
- **--path_to_model_config** - path to the model configuration with all information needed to build the model
- **--path_to_instructions** - file with instruction prompts
- **--path_to_options** - file mapping each dataset to its entity types

## Evaluation Results

Dataset | Precision | Recall | F1-Score (weighted)
--- | --- | --- | ---
CONLL-2003 | 0.862 | 0.843 | 0.852
MIT MOVIE | 0.792 | 0.845 | 0.809
MIT REST | 0.766 | 0.771 | 0.768
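
For reference, span-level scores of this kind compare predicted spans against gold spans. The sketch below shows a simple exact-match version; it is not necessarily the metric implementation used by `instruction_ner-evaluate`, which reports a weighted F1 over entity types.

```python
def span_prf(gold_spans, pred_spans):
    """Exact-match span precision / recall / F1 over (start, end, label) tuples.

    Illustrative only; not necessarily the repository's metric code.
    """
    gold, pred = set(gold_spans), set(pred_spans)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

print(span_prf(
    gold_spans=[(11, 15, "PER"), (46, 52, "LOC")],
    pred_spans=[(11, 15, "PER"), (46, 52, "LOC"), (65, 79, "ORG")],
))  # roughly (0.667, 1.0, 0.8)
```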

## Prediction Sample
```
Sentence: The protest , which attracted several thousand supporters , coincided with the 18th anniversary of Spain 's constitution .
Instruction: please extract entities and their types from the input sentence, all entity types are in options
Options: ORG, PER, LOC

Prediction (raw text): Spain is a LOC.
```
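
A prompt in the style shown above combines the sentence, the instruction, and the allowed entity types. The helper below reproduces that layout for illustration; the exact template the package builds internally may differ.

```python
def build_prompt(sentence, instruction, options):
    # Mirrors the sample layout above; the package's internal template may differ.
    return (
        f"Sentence: {sentence}\n"
        f"Instruction: {instruction}\n"
        f"Options: {', '.join(options)}"
    )

print(build_prompt(
    sentence="The protest , which attracted several thousand supporters , "
             "coincided with the 18th anniversary of Spain 's constitution .",
    instruction="please extract entities and their types from the input sentence, "
                "all entity types are in options",
    options=["ORG", "PER", "LOC"],
))
```
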
## Inference

### Models
[t5-base-ner-conll](https://huggingface.co/olgaduchovny/t5-base-ner-conll)

[t5-base-ner-mit-restaurant](https://huggingface.co/olgaduchovny/t5-base-ner-mit-restaurant)

[t5-base-ner-mit-movie](https://huggingface.co/olgaduchovny/t5-base-ner-mit-movie)
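
These checkpoints are regular T5 seq2seq models on the Hugging Face Hub, so they can also be loaded with plain `transformers` when the package wrapper is not needed. A minimal sketch, assuming a prompt in the style of the prediction sample above (the wrapper in `instruction_ner.model` builds its own prompt internally):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

name = "olgaduchovny/t5-base-ner-conll"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

# Assumed prompt layout, following the prediction sample above.
prompt = (
    "Sentence: Olga lives in Moscow .\n"
    "Instruction: please extract entities and their types from the input sentence, "
    "all entity types are in options\n"
    "Options: LOC, PER, ORG, MISC"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=2, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```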

### Code
```python
from instruction_ner.model import Model

model = Model(
    model_path_or_name="olgaduchovny/t5-base-ner-conll",
    tokenizer_path_or_name="olgaduchovny/t5-base-ner-conll"
)

options = ["LOC", "PER", "ORG", "MISC"]

instruction = "please extract entities and their types from the input sentence, " \
              "all entity types are in options"

text = "My name is Olga. I am 24 years old. I live in Moscow and work at Sber AI Center as a Senior NLP Data Scientist." \
        "This is my reporitory to test generative NER problem with T5 model."

generation_kwargs = {
    "num_beams": 2,
    "max_length": 128
}

pred_text, pred_spans = model.predict(
    text=text,
    generation_kwargs=generation_kwargs,
    instruction=instruction,
    options=options
)

print((pred_text, pred_spans))
# Expected output:
# ('Olga is a PER, Moscow is a LOC, Sber AI Center is an ORG, NLP is a MISC.',
#  [(11, 15, 'PER'), (46, 52, 'LOC'), (65, 79, 'ORG'), (92, 95, 'MISC')])
```
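
Because the returned spans index directly into the input `text`, the entity surface strings can be recovered without parsing the generated sentence, e.g.:

```python
# Group the predicted spans by label and pull the surface strings out of `text`.
entities = {}
for start, end, label in pred_spans:
    entities.setdefault(label, []).append(text[start:end])

print(entities)
# For the example output above:
# {'PER': ['Olga'], 'LOC': ['Moscow'], 'ORG': ['Sber AI Center'], 'MISC': ['NLP']}
```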



## Citation
```bibtex
@article{wang2022instructionner,
  title={Instructionner: A multi-task instruction-based generative framework for few-shot ner},
  author={Wang, Liwen and Li, Rumei and Yan, Yang and Yan, Yuanmeng and Wang, Sirui and Wu, Wei and Xu, Weiran},
  journal={arXiv preprint arXiv:2203.03903},
  year={2022}
}
```

            
