adaseq

- Name: adaseq
- Version: 0.6.6
- Home page: https://github.com/modelscope/adaseq
- Summary: AdaSeq: An All-in-One Library for Developing State-of-the-Art Sequence Understanding Models
- Upload time: 2023-11-15 12:07:01
- Author: Alibaba Damo Academy NLP foundation team
- Requires Python: >=3.7.0
- License: Apache License 2.0
- Requirements: addict, datasets, modelscope (>=1.4.0), seqeval, torch (>=1.11.0), tqdm (>=4.64.0), transformers (>=4.21.0), urllib3 (>=1.26.0)
# AdaSeq: An All-in-One Library for Developing State-of-the-Art Sequence Understanding Models

<div align="center">

[![license](https://img.shields.io/github/license/modelscope/adaseq.svg)](./LICENSE)
[![modelscope](https://img.shields.io/badge/modelscope->=1.4.0-624aff.svg)](https://modelscope.cn/)
![version](https://img.shields.io/github/tag/modelscope/adaseq.svg)
[![issues](https://img.shields.io/github/issues/modelscope/adaseq.svg)](https://github.com/modelscope/AdaSeq/issues)
[![stars](https://img.shields.io/github/stars/modelscope/adaseq.svg)](https://github.com/modelscope/AdaSeq/stargazers)
[![downloads](https://static.pepy.tech/personalized-badge/adaseq?period=total&left_color=grey&right_color=yellowgreen&left_text=downloads)](https://pypi.org/project/adaseq)
[![contribution](https://img.shields.io/badge/contributions-welcome-brightgreen.svg)](./CONTRIBUTING.md)

</div>

<div align="center">

English | [简体中文](./README_zh.md)

</div>

## Introduction
***AdaSeq*** (**A**libaba **D**amo **A**cademy **Seq**uence Understanding Toolkit) is an easy-to-use all-in-one library, built on [ModelScope](https://modelscope.cn/home), that allows researchers and developers to train custom models for sequence understanding tasks, including part-of-speech (POS) tagging, chunking, named entity recognition (NER), entity typing, relation extraction (RE), and more.

![](./docs/imgs/task_examples_en.png)
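For a concrete sense of the task format these tasks share, here is a minimal, made-up example of NER viewed as token-level sequence labeling in the common BIO scheme (tokens and labels are invented for illustration):

```
# Illustrative only: NER as token-level sequence labeling (BIO scheme).
# B- marks the beginning of an entity span, I- its continuation, O no entity.
tokens = ["Alibaba", "DAMO", "Academy", "is", "based", "in", "Hangzhou", "."]
labels = ["B-ORG", "I-ORG", "I-ORG", "O", "O", "O", "B-LOC", "O"]

for token, label in zip(tokens, labels):
    print(f"{token}\t{label}")
```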

<details open>
<summary>🌟 <b>Features:</b></summary>

- **Plentiful Models**:

  AdaSeq provides plenty of cutting-edge models, training methods, and useful toolkits for sequence understanding tasks.

- **State-of-the-Art**:

  We aim to develop the best implementations, which can outperform many off-the-shelf frameworks in performance.

- **Easy-to-Use**:

  A single command is all you need to obtain the best model.

- **Extensible**:

  It's easy to register a new module, or to build a customized sequence understanding model by assembling the predefined modules; see the sketch after this feature list.

</details>
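As a rough, hypothetical sketch of what "registering a module" usually means in registry-based frameworks like AdaSeq (the real AdaSeq decorator names and import paths may differ; consult the tutorials for the actual API):

```
# Hypothetical sketch of the registry pattern behind "register a module".
# AdaSeq builds on ModelScope-style registries; the real decorator names
# and import paths may differ -- see the AdaSeq docs for the actual API.
from typing import Callable, Dict, Type

MODELS: Dict[str, Type] = {}

def register_model(name: str) -> Callable[[Type], Type]:
    """Record a model class under a string key, so that a config
    can refer to it by name instead of by import path."""
    def wrap(cls: Type) -> Type:
        MODELS[name] = cls
        return cls
    return wrap

@register_model("my-sequence-labeling-model")
class MySequenceLabelingModel:
    def __init__(self, num_labels: int) -> None:
        self.num_labels = num_labels

# A config entry like `model: {type: my-sequence-labeling-model}` can then
# be resolved to the registered class and instantiated:
model_cls = MODELS["my-sequence-labeling-model"]
model = model_cls(num_labels=9)
print(type(model).__name__, model.num_labels)
```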

⚠️**Notice:** This project is under rapid development, so some interfaces may change in the future.

## 📢 What's New
- 2023-07: [SemEval 2023] Our U-RaNER paper won the [Best Paper Award](https://semeval.github.io/SemEval2023/awards)!
- 2023-03: [SemEval 2023] Our U-RaNER won ***1st place in 9 tracks*** at [SemEval 2023 Task 2](https://multiconer.github.io/results): Multilingual Complex Named Entity Recognition! [Model introduction and source code can be found here](./examples/U-RaNER).
- 2022-12: [[EMNLP 2022] Retrieval-augmented Multimodal Entity Understanding Model (MoRe)](./examples/MoRe)
- 2022-11: [[EMNLP 2022] Ultra-Fine Entity Typing Model (NPCRF)](./examples/NPCRF)
- 2022-11: [[EMNLP 2022] Unsupervised Boundary-Aware Language Model (BABERT)](./examples/babert)

## ⚡ Quick Experience
You can try out our models via online demos built on ModelScope:
[[English NER]](https://modelscope.cn/models/damo/nlp_raner_named-entity-recognition_english-large-news/summary)
[[Chinese NER]](https://modelscope.cn/models/damo/nlp_raner_named-entity-recognition_chinese-base-news/summary)
[[CWS]](https://modelscope.cn/models/damo/nlp_structbert_word-segmentation_chinese-base/summary)

More tasks, more languages, more domains: all the model cards we have released can be found on the [Modelcards](./docs/modelcards.md) page.
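If `modelscope` is installed, the released models can also be run locally through ModelScope's pipeline API. A minimal sketch for the Chinese NER model linked above (the task string and the exact output format follow ModelScope's conventions and may change between versions):

```
# Sketch: run a released NER model locally via ModelScope's pipeline API.
# Requires `pip install modelscope`; the model is downloaded on first use.
from modelscope.pipelines import pipeline

ner = pipeline(
    task='named-entity-recognition',
    model='damo/nlp_raner_named-entity-recognition_chinese-base-news',
)
result = ner('阿里巴巴达摩院位于杭州')  # "Alibaba DAMO Academy is located in Hangzhou"
print(result)  # expected: a dict whose 'output' lists the recognized entity spans
```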

## 🛠️ Model Zoo
<details open>
<summary><b>Supported models:</b></summary>

- [Transformer-based CRF](./examples/bert_crf)
- [Partial CRF](./examples/partial_bert_crf)
- [Retrieval Augmented NER](./examples/RaNER)
- [Biaffine NER](./examples/biaffine_ner)
- [Global-Pointer](./examples/global_pointer)
- [Multi-label Entity Typing](./examples/entity_typing)
- ...
</details>

## 💾 Dataset Zoo
We have collected many datasets for sequence understanding tasks; all of them are listed on the [Datasets](./docs/datasets.md) page.

## 📦 Installation
The AdaSeq project requires `Python >= 3.7`, `PyTorch >= 1.8` and `ModelScope >= 1.4`. We have verified that AdaSeq runs smoothly with `ModelScope == 1.9.5`.

- Installation via pip:
```
pip install adaseq
```

- Installation from source:
```
git clone https://github.com/modelscope/adaseq.git
cd adaseq
pip install -r requirements.txt -f https://modelscope.oss-cn-beijing.aliyuncs.com/releases/repo.html
```

### Verify the Installation
To verify that AdaSeq is installed properly, we provide a demo config for training a model (the demo config is downloaded automatically).
```
adaseq train -c demo.yaml
```
You will see the training logs in your terminal. Once training is done, the results on the test set will be printed: `test: {"precision": xxx, "recall": xxx, "f1": xxx}`, and a folder `experiments/toy_msra/` will be created to store all experimental results and model checkpoints.
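The printed precision/recall/F1 are span-level metrics of the kind computed by `seqeval`, one of AdaSeq's dependencies. A small self-contained sketch of how such numbers are derived from BIO tags (the tag sequences here are invented for illustration):

```
# Sketch: span-level precision/recall/F1 over BIO tags, as computed by
# seqeval (an AdaSeq dependency). The tag sequences are invented examples.
from seqeval.metrics import precision_score, recall_score, f1_score

y_true = [["B-PER", "I-PER", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "O"]]

print({
    "precision": precision_score(y_true, y_pred),  # 1.0: the one predicted span is correct
    "recall": recall_score(y_true, y_pred),        # 0.5: the LOC span was missed
    "f1": f1_score(y_true, y_pred),                # 2PR/(P+R) ~= 0.667
})
```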

## 📖 Tutorials
- [Quick Start](./docs/tutorials/quick_start.md)
- Basics
  - [Learning about Configs](./docs/tutorials/learning_about_configs.md)
  - [Customizing Dataset](./docs/tutorials/customizing_dataset.md)
  - [TODO] Common Architectures
  - [TODO] Useful Hooks
  - [Hyperparameter Optimization](./docs/tutorials/hyperparameter_optimization.md)
  - [Training with Multiple GPUs](./docs/tutorials/training_with_multiple_gpus.md)
- Best Practice
  - [Training a Model with Custom Dataset](./docs/tutorials/training_a_model.md)
  - [Reproducing Results in Published Papers](./docs/tutorials/reproducing_papers.md)
  - [TODO] Uploading Saved Model to ModelScope
  - [TODO] Customizing your Model
  - [TODO] Serving with AdaLA

## 📝 Contributing
Contributions to improve AdaSeq are welcome. Please refer to [CONTRIBUTING.md](./CONTRIBUTING.md) for the contribution guidelines.

## 📄 License
This project is licensed under the Apache License (Version 2.0).



            
