<div align="center">
<img src="https://s1.ax1x.com/2023/03/07/ppZfEmq.png" border="0" width=600px/>
</div>
------
<p align="center">
<a href="#overview">Overview</a> •
<a href="#installation">Installation</a> •
<a href="https://arxiv.org/abs/2303.02913">Paper</a> •
<a href="https://github.com/Shark-NLP/OpenICL/tree/main/examples">Examples</a> •
<a href="https://openicl.readthedocs.io/en/latest/index.html">Docs</a> •
<a href="#citation">Citation</a>
</p>
![version](https://img.shields.io/badge/version-0.1.8-blue)
## Overview
OpenICL provides an easy interface for in-context learning, with many state-of-the-art retrieval and inference methods built in to facilitate systematic comparison of LMs and fast research prototyping. Users can easily incorporate different retrieval and inference methods, as well as different prompt instructions into their workflow.
<div align="center">
<img src="https://s1.ax1x.com/2023/03/07/ppZWjmt.jpg" border="0" />
</div>
## What's New
+ **v0.1.8** Added support for LLaMA and self-consistency
## Installation
Note: OpenICL requires Python 3.8+
**Using Pip**
```
pip install openicl
```
**Installation for local development:**
```
git clone https://github.com/Shark-NLP/OpenICL
cd OpenICL
pip install -e .
```
## Quick Start
The following example shows how to perform ICL on a sentiment-classification dataset. More examples and tutorials can be found in [examples](https://github.com/Shark-NLP/OpenICL/tree/main/examples).
#### Step 1: Load and prepare data
```python
from datasets import load_dataset
from openicl import DatasetReader
# Loading dataset from huggingface
dataset = load_dataset('gpt3mix/sst2')
# Define a DatasetReader, with specified column names where input and output are stored.
data = DatasetReader(dataset, input_columns=['text'], output_column='label')
```
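Conceptually, the reader just pairs each row's input column(s) with its output column. A toy sketch with plain dicts (not the OpenICL API) of what `input_columns` and `output_column` select:

```python
# Hypothetical rows mimicking the sst2 layout: a 'text' input and a 'label' output.
rows = [{"text": "a great movie", "label": 0},
        {"text": "a boring movie", "label": 1}]

# input_columns=['text'] selects the inputs; output_column='label' the references.
inputs = [r["text"] for r in rows]
references = [r["label"] for r in rows]
print(inputs)      # → ['a great movie', 'a boring movie']
print(references)  # → [0, 1]
```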
#### Step 2: Define the prompt template (Optional)
```python
from openicl import PromptTemplate
tp_dict = {
    0: "</E>Positive Movie Review: </text>",
    1: "</E>Negative Movie Review: </text>"
}
template = PromptTemplate(tp_dict, {'text': '</text>'}, ice_token='</E>')
```
The placeholders `</E>` and `</text>` will be replaced by the in-context examples and the test input, respectively. For more detailed information about `PromptTemplate` (such as string-type templates), please see [tutorial1](https://github.com/Shark-NLP/OpenICL/blob/main/examples/tutorials/openicl_tutorial1_getting_started.ipynb).
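To illustrate the substitution, here is a minimal sketch using plain string replacement (the example review and test input are made up; OpenICL performs this rendering internally):

```python
# One retrieved in-context example, already rendered with its label's template.
ice = "Positive Movie Review: a charming, heartfelt film\n"

# The template: `</E>` receives the in-context examples, `</text>` the test input.
template = "</E>Positive Movie Review: </text>"
prompt = template.replace("</E>", ice).replace("</text>", "an absolute delight")
print(prompt)
# Positive Movie Review: a charming, heartfelt film
# Positive Movie Review: an absolute delight
```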
#### Step 3: Initialize the Retriever
```python
from openicl import TopkRetriever
# Define a retriever using the previous `DatasetReader`.
# `ice_num` specifies the number of in-context examples.
retriever = TopkRetriever(data, ice_num=8)
```
Here we use the popular <a href="https://arxiv.org/abs/2101.06804">TopK</a> method to build the retriever.
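The core idea of TopK retrieval can be sketched in a few lines: embed the candidate examples and the test input, then keep the k candidates with the highest cosine similarity. The toy vectors below stand in for real sentence embeddings:

```python
def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def topk(candidates, query, k):
    # candidates: list of (index, embedding); returns the k nearest indices.
    ranked = sorted(candidates, key=lambda c: cosine(c[1], query), reverse=True)
    return [idx for idx, _ in ranked[:k]]

candidates = [(0, [1.0, 0.0]), (1, [0.0, 1.0]), (2, [0.9, 0.1])]
print(topk(candidates, [1.0, 0.0], 2))  # → [0, 2]
```

In practice the embeddings come from a sentence encoder and the nearest-neighbor search is accelerated (e.g. with FAISS), but the ranking logic is the same.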
#### Step 4: Initialize the Inferencer
```python
from openicl import PPLInferencer
inferencer = PPLInferencer(model_name='distilgpt2')
```
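Perplexity-based inference for classification works by rendering the prompt once per candidate label, scoring each rendering with the LM, and predicting the label whose rendering has the lowest perplexity. A minimal sketch of that selection logic, where `fake_ppl` is a hypothetical stand-in for a real LM scoring call:

```python
def fake_ppl(prompt):
    # Hypothetical scorer: pretend the LM assigns lower perplexity
    # to prompts whose label text matches the input's sentiment.
    return 5.0 if "Positive" in prompt else 9.0

def ppl_predict(templates, text):
    # Render one prompt per label, score each, pick the lowest-perplexity label.
    scores = {label: fake_ppl(tpl.replace("</text>", text))
              for label, tpl in templates.items()}
    return min(scores, key=scores.get)

templates = {0: "Positive Movie Review: </text>",
             1: "Negative Movie Review: </text>"}
print(ppl_predict(templates, "a wonderful film"))  # → 0
```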
#### Step 5: Inference and scoring
```python
from openicl import AccEvaluator
# The inferencer needs the retriever to collect in-context examples, and a template to wrap them up.
predictions = inferencer.inference(retriever, ice_template=template)
# Compute accuracy for the predictions
score = AccEvaluator().score(predictions=predictions, references=data.references)
print(score)
```
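The accuracy computed above reduces to the fraction of predictions matching the references, which can be sketched as:

```python
def accuracy(predictions, references):
    # Fraction of positions where the prediction equals the reference.
    assert len(predictions) == len(references)
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)

print(accuracy([0, 1, 1, 0], [0, 1, 0, 0]))  # → 0.75
```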
## Docs
**(updating...)**
[OpenICL Documentation](https://openicl.readthedocs.io/en/latest/index.html)
## Citation
If you find this repository helpful, feel free to cite our paper:
```bibtex
@article{wu2023openicl,
  title={OpenICL: An Open-Source Framework for In-context Learning},
  author={Wu, Zhenyu and Wang, Yaoxiang and Ye, Jiacheng and Feng, Jiangtao and Xu, Jingjing and Qiao, Yu and Wu, Zhiyong},
  journal={arXiv preprint arXiv:2303.02913},
  year={2023}
}
```