| Name | chatbot-kernel |
| Version | 0.3.3 |
| download | |
| home_page | None |
| Summary | A Jupyter kernel using LLM models from Huggingface |
| upload_time | 2024-08-27 15:25:10 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.8 |
| license | BSD 3-Clause License (Copyright (c) 2024, Chia-Jung Hsu); full text in the raw data below |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# chatbot_kernel
A Jupyter kernel that lets you use Jupyter as a chat window, running downloaded LLMs locally.
## Installation
The kernel package can be installed from PyPI:
```
$ pip install chatbot_kernel
```
Once the package is installed, the kernel spec can be installed to your home directory with:
```
$ python -m chatbot_kernel install --user
```
If you are using a virtualenv, run
```
$ python -m chatbot_kernel install --sys-prefix
```
instead.
If you install the package into a virtual environment, you may need to set `JUPYTER_PATH=/path/to/venv/share/jupyter:$JUPYTER_PATH` so that Jupyter can find the kernel.
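To check whether the kernel spec is actually discoverable, you can query `jupyter_client` directly. This is only a quick sanity check, and it assumes the spec is registered under the name `chatbot_kernel`; the name may differ in your installation:
```
# Sanity check: list the kernel specs Jupyter can see and look for chatbot_kernel.
from jupyter_client.kernelspec import KernelSpecManager

specs = KernelSpecManager().get_all_specs()
if "chatbot_kernel" in specs:
    print("kernel found at:", specs["chatbot_kernel"]["resource_dir"])
else:
    print("kernel not found; check JUPYTER_PATH or re-run the install command")
```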
## Usage
A few magics are available in the kernel:
- `%help`: Print help messages
- `%config`: Set advanced configuration. See more in `%config help`
- `%load`: Load a pretrained LLM
- `%hf_home`: Set the path to find downloaded LLMs, similar to setting the `HF_HOME` environment variable
- `%model_list`: Show the available LLMs
- `%new_chat`: Clean up the chat history
Before you start chatting, you need to download at least one model from [HuggingFace](https://huggingface.co/docs/hub/models-downloading), for example with `huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct`.
Once a model is downloaded, launch Jupyter Notebook/Lab, execute `%load <model>`, and start chatting. Here is an example:
```
%load meta-llama/Meta-Llama-3-8B-Instruct
hi
who are you
```
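The session above corresponds roughly to the `transformers` calls sketched below. This is not the kernel's actual implementation, just an illustration of the equivalent Python; it assumes the model is already downloaded and that its tokenizer defines a chat template:
```
# Rough sketch of what %load plus one chat turn boils down to in transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3-8B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" needs the accelerate package; adjust dtype/device to your hardware.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Chat history is a list of role/content messages; %new_chat would reset it.
messages = [{"role": "user", "content": "hi"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```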
## Caveat
Currently, the kernel uses `AutoModelForCausalLM`, which is not supported by all models.
A few models have been tested:
- `meta-llama/Meta-Llama-3-8B-Instruct`
- `mistralai/Mistral-7B-Instruct-v0.3`: needs the `sentencepiece` dependency
- `unsloth/llama-3-8b-Instruct-bnb-4bit`: needs the `bitsandbytes` dependency (see the loading sketch below)
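For the 4-bit variant, loading should look roughly like the sketch below. This is an illustration under the assumption that `bitsandbytes` is installed and a CUDA GPU is available, not code taken from the kernel itself:
```
# Sketch: loading a pre-quantized bnb-4bit checkpoint. The quantization settings
# are read from the checkpoint's own config, but the bitsandbytes package
# (pip install bitsandbytes) and a CUDA GPU must be available.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "unsloth/llama-3-8b-Instruct-bnb-4bit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```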
Raw data
{
"_id": null,
"home_page": null,
"name": "chatbot-kernel",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": null,
"author": null,
"author_email": "\"Chia-Jung Hsu @ C3SE\" <chiajung.hsu@chalmers.se>",
"download_url": "https://files.pythonhosted.org/packages/da/18/8e79bd566c68ab0f79d0fb0bf0e1f22004891c91befa43b350e99fbfbd51/chatbot_kernel-0.3.3.tar.gz",
"platform": null,
"description": "# chatbot_kernel\nA Jupyter kernel to use Jupyter as a chat window, running downloaded LLMs locally.\n\n## Installation\nThis kernel package can be install from PyPI\n```\n$ pip install chatbot_kernel\n```\n\nOnce the package installed, the kernel spec can be installed to home directory by command:\n```\n$ python -m chatbot_kernel install --user\n```\nIf you are using virtualenv, do\n```\n$ python -m chatbot_kernel install --sys-prefix\n```\ninstead.\n\nIf you install the package to a virtual environment, you may need to set up `JUPYTER_PATH=/path/to/venv/share/jupyter:$JUPYTER_PATH` so that jupyter can find the kernel\n\n## Usage\nA few magics are available in the kernel:\n- `%help`: Print help messages\n- `%config`: Set advanced configuration. See more in `%config help`\n- `%load`: Load a pretrained LLM\n- `%hf_home`: Set the path to find downloaded LLMs, similar to set `HF_HOME` environment variable\n- `%model_list`: Show the available LLMs\n- `%new_chat`: Clean up the chat history\n\nBefore start chatting, you need to at least download a model from [HuggingFace](https://huggingface.co/docs/hub/models-downloading). For example, `huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct`.\nOnce models are downloaded, launch a jupyter notebook/lab and execute `%load <model>` and start chatting. Here is an example:\n```\n%load meta-llama/Meta-Llama-3-8B-Instruct\nhi \nwho are you\n```\n\n## Caveat\nCurrently, the kernel use the `AutoModelForCausalLM` and it is not supported by all models.\nA few models have been tested:\n- `meta-llama/Meta-Llama-3-8B-Instruct`\n- `mistralai/Mistral-7B-Instruct-v0.3`: needs `sentencepiece` dependency\n- `unsloth/llama-3-8b-Instruct-bnb-4bit`: needs `bitsandbytes` dependency\n\n",
"bugtrack_url": null,
"license": "BSD 3-Clause License\n \n Copyright (c) 2024, Chia-Jung Hsu\n \n Redistribution and use in source and binary forms, with or without\n modification, are permitted provided that the following conditions are met:\n \n 1. Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n \n 2. Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n \n 3. Neither the name of the copyright holder nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n \n THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\n FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\n OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.",
"summary": "A Jupyter kernel using LLM models from Huggingface",
"version": "0.3.3",
"project_urls": {
"Homepage": "https://github.com/appolloford/chatbot_kernel"
},
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "8b44e93dad51183b2f0e7eb600123bd8b0a462a2c0a7a5a2f79598c483374744",
"md5": "6d3ca017677b482e4a589e65c67487bd",
"sha256": "6f7e26dcf9acfaa3b61ed68ea12eeb7984211f31268a6d55ba345fbb7571399b"
},
"downloads": -1,
"filename": "chatbot_kernel-0.3.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "6d3ca017677b482e4a589e65c67487bd",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 9450,
"upload_time": "2024-08-27T15:25:08",
"upload_time_iso_8601": "2024-08-27T15:25:08.547660Z",
"url": "https://files.pythonhosted.org/packages/8b/44/e93dad51183b2f0e7eb600123bd8b0a462a2c0a7a5a2f79598c483374744/chatbot_kernel-0.3.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "da188e79bd566c68ab0f79d0fb0bf0e1f22004891c91befa43b350e99fbfbd51",
"md5": "32dd2872ab3bb2cdebbe971cf39d1fca",
"sha256": "b498dcf8bf053716b1e3d3d91832c9db15f011424d97fd3678c43021b8e1d3fd"
},
"downloads": -1,
"filename": "chatbot_kernel-0.3.3.tar.gz",
"has_sig": false,
"md5_digest": "32dd2872ab3bb2cdebbe971cf39d1fca",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 8821,
"upload_time": "2024-08-27T15:25:10",
"upload_time_iso_8601": "2024-08-27T15:25:10.201036Z",
"url": "https://files.pythonhosted.org/packages/da/18/8e79bd566c68ab0f79d0fb0bf0e1f22004891c91befa43b350e99fbfbd51/chatbot_kernel-0.3.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-08-27 15:25:10",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "appolloford",
"github_project": "chatbot_kernel",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "chatbot-kernel"
}