# ipy-llm-kernel
ipy-llm-kernel is a Jupyter kernel that allows you to type prompts in plain English and receive responses from a large language model (LLM).
![img_1.png](docs/images/img_1.png)
It can make use of [OpenAI's chatGPT](https://openai.com/blog/openai-api), [Anthropic's Claude](https://www.anthropic.com/api), [Helmholtz' blablador](https://helmholtz-blablador.fz-juelich.de/) and [Ollama](https://ollama.com).
You need an OpenAI, Anthropic, Google or Helmholtz account to use it.
Using it with Ollama is free but requires running an Ollama server locally.
> [!CAUTION]
> When using the OpenAI, Google Gemini, Anthropic or any other endpoint via ipy-llm-kernel, you are bound to the terms of service
> of the respective companies or organizations.
> The prompts you enter are transferred to their servers and may be processed and stored there.
> Make sure not to submit any sensitive, confidential or personal data. Also note that using these services may cost money.
## Usage
After starting `jupyter lab`, select the LLM Kernel.
![img.png](docs/images/img.png)
You can then type prompts in English and receive responses from the LLM, as demonstrated above.
### Generating images
If your prompt asks for an image, the kernel can generate one for you. At the moment, only OpenAI's DALL-E 3 is supported for this, so you need an OpenAI API key.
![img.png](docs/images/imagen_example.png)
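The image feature reads your OpenAI key from the environment. Assuming the kernel uses the standard `OPENAI_API_KEY` variable of OpenAI's Python client (an assumption worth verifying against your installed version), you could set it like this on Linux/macOS:

```
# Assumption: the kernel picks up the standard OPENAI_API_KEY environment
# variable used by OpenAI's Python client. The value below is a placeholder,
# not a real key - replace it with your own.
export OPENAI_API_KEY="sk-your-key-here"
```

On Windows, set the variable via the system environment variable dialog instead.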
## Installation
First, create an environment variable named `IPY_LLM_KERNEL_MODEL` and set it to a model name, depending on which service provider you want to use. Examples:
* `llama3:8b`
* `blablador:alias:large`
* `claude-3-5-sonnet-20240620`
* `gpt-4o-2024-08-06`
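For example, to use a local Ollama model, you could set the variable for the current shell session like this on Linux/macOS (on Windows, use the system environment variable dialog instead):

```
# Set the model used by ipy-llm-kernel for this shell session (Linux/macOS).
# To make the setting permanent, add this line to your ~/.bashrc or ~/.zshrc.
export IPY_LLM_KERNEL_MODEL=llama3:8b

# Verify that the variable is set
echo "$IPY_LLM_KERNEL_MODEL"
```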
Then, open a new terminal and install `ipy-llm-kernel` using pip. It is recommended to install it into a conda/mamba environment. If you have never used conda before, please [read this guide first](https://biapol.github.io/blog/mara_lampert/getting_started_with_mambaforge_and_python/readme.html).
```
pip install ipy-llm-kernel
```
Afterwards, additionally run this command to register the kernel:
```
python -m ipy_llm_kernel install
```
You can check if it's installed by printing out the list of installed kernels:
```
jupyter kernelspec list
```
And you can uninstall it using this command:
```
jupyter kernelspec uninstall llm-kernel
```
## Development
If you want to contribute to `ipy-llm-kernel`, you can install it in development mode like this:
```
git clone https://github.com/haesleinhuepf/ipy-llm-kernel.git
cd ipy-llm-kernel
pip install -e .
```
## Similar projects
There are similar projects:
* [jupyter-ai](https://github.com/jupyterlab/jupyter-ai)
* [chatGPT-jupyter-extension](https://github.com/jflam/chat-gpt-jupyter-extension)
* [chapyter](https://github.com/chapyter/chapyter/)
* [ipython-gpt](https://github.com/santiagobasulto/ipython-gpt)
## Issues
If you encounter any problems or want to provide feedback or suggestions, please create an [issue](https://github.com/haesleinhuepf/ipy-llm-kernel/issues) along with a detailed description and tag @haesleinhuepf .