# Python GPT4All
This package contains a set of Python bindings around the `llmodel` C-API.
Package on PyPI: https://pypi.org/project/gpt4all/
## Documentation
https://docs.gpt4all.io/gpt4all_python.html
## Installation
The easiest way to install the Python bindings for GPT4All is to use pip:
```
pip install gpt4all
```
This will download the latest version of the `gpt4all` package from PyPI.
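To confirm the install succeeded, you can query the installed distribution's version from the standard library. A minimal sketch (the `installed_version` helper is just for illustration; `importlib.metadata` ships with Python 3.8+, matching this package's minimum supported version):

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional


def installed_version(dist: str) -> Optional[str]:
    """Return the installed version of distribution `dist`, or None if absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None


print("gpt4all version:", installed_version("gpt4all"))
```

If this prints `None`, the package is not visible to the interpreter you are running, which usually means pip installed it into a different environment.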
## Local Build
As an alternative to downloading via pip, you may build the Python bindings from source.
### Prerequisites
You will need a compiler. On Windows, you should install Visual Studio with the C++ Development components. On macOS, you will need the full version of Xcode; the Xcode Command Line Tools alone lack certain required tools. On Linux, you will need a GCC or Clang toolchain with C++ support.
On Windows and Linux, building GPT4All with full GPU support requires the [Vulkan SDK](https://vulkan.lunarg.com/sdk/home) and the latest [CUDA Toolkit](https://developer.nvidia.com/cuda-downloads).
### Building the Python bindings
1. Clone GPT4All and change directory:
```
git clone --recurse-submodules https://github.com/nomic-ai/gpt4all.git
cd gpt4all/gpt4all-backend
```
2. Build the backend.
If you are using Windows and have Visual Studio installed:
```
cmake -B build
cmake --build build --parallel --config RelWithDebInfo
```
For all other platforms:
```
cmake -B build -DCMAKE_BUILD_TYPE=RelWithDebInfo
cmake --build build --parallel
```
`RelWithDebInfo` is a good default, but you can also use `Release` (no debug info, smaller binaries) or `Debug` (unoptimized, easier to step through in a debugger).
3. Install the Python package:
```
cd ../gpt4all-bindings/python
pip install -e .
```
## Usage
Test it out! In a Python script or console:
```python
from gpt4all import GPT4All
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")
output = model.generate("The capital of France is ", max_tokens=3)
print(output)
```
### GPU Usage
```python
from gpt4all import GPT4All
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf", device='gpu')  # also accepts device='amd' or device='intel'
output = model.generate("The capital of France is ", max_tokens=3)
print(output)
```
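The bindings can also stream output token by token instead of returning the whole completion at once. A small sketch, assuming the same model file as above (`streaming=True` makes `generate` return an iterator of text fragments):

```python
from gpt4all import GPT4All


def stream_completion(prompt: str, model_file: str = "orca-mini-3b-gguf2-q4_0.gguf") -> None:
    """Print tokens as they arrive instead of waiting for the full reply."""
    model = GPT4All(model_file)
    for token in model.generate(prompt, max_tokens=20, streaming=True):
        print(token, end="", flush=True)
    print()
```

Streaming is useful for interactive applications, where showing partial output immediately feels much more responsive than waiting for the complete generation.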
## Troubleshooting a Local Build
- If you're on Windows and have compiled with a MinGW toolchain, you might run into an error like:
```
FileNotFoundError: Could not find module '<...>\gpt4all-bindings\python\gpt4all\llmodel_DO_NOT_MODIFY\build\libllmodel.dll'
(or one of its dependencies). Try using the full path with constructor syntax.
```
The key phrase in this case is _"or one of its dependencies"_. The Python interpreter you're using
probably doesn't see the MinGW runtime dependencies. At the moment, the following three are required:
`libgcc_s_seh-1.dll`, `libstdc++-6.dll` and `libwinpthread-1.dll`. You should copy them from MinGW
into a folder where Python will see them, preferably next to `libllmodel.dll`.
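  If you prefer to script the copy, here is a minimal sketch. The `copy_runtime_dlls` helper and the example paths are illustrative assumptions; point `mingw_bin` at your MinGW installation's `bin` directory and `target_dir` at the folder containing `libllmodel.dll`:

  ```python
  import shutil
  from pathlib import Path
  from typing import List

  # Runtime DLLs that a MinGW-built libllmodel.dll depends on.
  MINGW_RUNTIME_DLLS = ("libgcc_s_seh-1.dll", "libstdc++-6.dll", "libwinpthread-1.dll")


  def copy_runtime_dlls(mingw_bin: Path, target_dir: Path) -> List[Path]:
      """Copy the MinGW runtime DLLs into target_dir so Python can load them."""
      copied = []
      for name in MINGW_RUNTIME_DLLS:
          src = mingw_bin / name
          if src.is_file():
              copied.append(Path(shutil.copy2(src, target_dir)))
      return copied
  ```

  For example, `copy_runtime_dlls(Path(r"C:\mingw64\bin"), Path("gpt4all"))` would place the three DLLs next to the bindings' native library (assuming a default MinGW-w64 install location).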
- Note regarding the Microsoft toolchain: Compiling with MSVC is possible, but not the official way to
go about it at the moment. MSVC doesn't produce DLLs with a `lib` prefix, which the bindings expect.
You'd have to amend that yourself.