# mlipy
<!--
[![Build][build-image]]()
[![Status][status-image]][pypi-project-url]
[![Stable Version][stable-ver-image]][pypi-project-url]
[![Coverage][coverage-image]]()
[![Python][python-ver-image]][pypi-project-url]
[![License][mit-image]][mit-url]
-->
[Downloads](https://pypistats.org/packages/mlipy)
[PyPI](https://pypi.org/project/mlipy)
[License: MIT](https://opensource.org/licenses/MIT)
Pure **Python**-based **Machine Learning Interface** for multiple engines with multi-modal support.
<!--
Python HTTP Server/Client (including WebSocket streaming support) for:
- [candle](https://github.com/huggingface/candle)
- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- [LangChain](https://python.langchain.com)
-->
Python HTTP Server/Client (including WebSocket streaming support) for:
- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- [LangChain](https://python.langchain.com)
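To illustrate the WebSocket streaming flow described above, here is a hypothetical client sketch. The endpoint URL and JSON field names are assumptions for illustration, not the documented `mli` client API; adapt them to the actual server.

```python
# Hypothetical streaming-client sketch for an mlipy-style server.
# The URL and JSON field names are assumptions, not the documented API.
import asyncio
import json


def build_request(prompt: str, n_predict: int = 128) -> str:
    """Serialize a completion request as JSON (field names are assumed)."""
    return json.dumps({"prompt": prompt, "n_predict": n_predict})


async def stream_completion(ws_url: str, prompt: str) -> None:
    # Requires the third-party `websockets` package (pip install websockets).
    import websockets

    async with websockets.connect(ws_url) as ws:
        await ws.send(build_request(prompt))
        async for message in ws:  # each message is one streamed chunk
            chunk = json.loads(message)
            print(chunk.get("text", ""), end="", flush=True)


# Usage (against a locally running server; URL is an assumption):
#   asyncio.run(stream_completion("ws://127.0.0.1:5000/", "Hello"))
```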
# Prerequisites
## Debian/Ubuntu
```bash
sudo apt update -y
sudo apt install build-essential git curl libssl-dev libffi-dev pkg-config
```
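Before continuing, it can help to sanity-check that the build tools installed above are actually on `PATH` (the tool list below mirrors the `apt install` line):

```shell
# Verify the build prerequisites are reachable; prints ok/MISSING per tool.
for tool in gcc make git curl pkg-config; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: ok"
    else
        echo "$tool: MISSING"
    fi
done
```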
<!--
### Rust
1) Using latest system repository:
```bash
sudo apt install rustc cargo
```
2) Install rustup using official instructions:
```bash
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"
rustup default stable
```
-->
### Python
1) Install Python from the distribution's default repository:
```bash
sudo apt install python3.11 python3.11-dev python3.11-venv
```
2) Or install Python from the deadsnakes PPA (external repository):
```bash
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update -y
sudo apt install python3.11 python3.11-dev python3.11-venv
```
<!--
## Arch/Manjaro
### Rust
1) Using latest system-wide rust/cargo:
```bash
sudo pacman -Sy base-devel openssl libffi git rust cargo rust-wasm wasm-bindgen
```
2) Using latest rustup:
```bash
sudo pacman -Sy base-devel openssl libffi git rustup
rustup default stable
```
## macOS
```bash
brew update
brew install rustup
rustup default stable
```
-->
# llama.cpp
```bash
cd ~
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
make -j
```
<!--
# candle
```bash
cd ~
git clone https://github.com/huggingface/candle.git
cd candle
find candle-examples/examples/llama/main.rs -type f -exec sed -i 's/print!("{prompt}")/eprint!("{prompt}")/g' {} +
find candle-examples/examples/phi/main.rs -type f -exec sed -i 's/print!("{prompt}")/eprint!("{prompt}")/g' {} +
find candle-examples/examples/mistral/main.rs -type f -exec sed -i -E 's/print\\!\\("\\{t\\}"\\)$/eprint\\!\\("\\{t\\}"\\)/g' {} +
find candle-examples/examples/stable-lm/main.rs -type f -exec sed -i -E 's/print\\!\\("\\{t\\}"\\)$/eprint\\!\\("\\{t\\}"\\)/g' {} +
find candle-examples -type f -exec sed -i 's/println/eprintln/g' {} +
cargo clean
```
CPU:
```bash
cargo build -r --bins --examples
```
GPU / CUDA:
```bash
cargo build --features cuda -r --bins --examples
```
-->
# Run Development Server
Set up a virtualenv and install the requirements:
```bash
git clone https://github.com/mtasic85/mlipy.git
cd mlipy
python3.11 -m venv venv
source venv/bin/activate
pip install poetry
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
poetry install
```
Run server:
```bash
python -B -m mli.server --llama-cpp-path='~/llama.cpp'
```
# Run Examples
Using GPU:
```bash
NGL=99 python -B examples/sync_demo.py
```
Using CPU:
```bash
python -B examples/sync_demo.py
python -B examples/async_demo.py
python -B examples/langchain_sync_demo.py
python -B examples/langchain_async_demo.py
```
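The `NGL` variable in the GPU example above is read from the environment by the demos; the name suggests it controls how many layers are offloaded to the GPU (an assumption based on the variable name; check the examples' source for the exact meaning). A script can read it like this:

```python
# Read NGL from the environment; the "number of GPU layers" meaning is an
# assumption based on the variable name in the GPU example above.
import os


def gpu_layers(default: int = 0) -> int:
    """Return NGL from the environment, falling back to CPU-only (0)."""
    return int(os.environ.get("NGL", str(default)))


print(gpu_layers())  # prints 0 unless NGL is set, e.g. NGL=99
```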
# Run Production Server
## Generate self-signed SSL certificates
```bash
openssl req -x509 -nodes -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365
```
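The command above prompts interactively for the certificate subject. A non-interactive variant (with `-subj` to skip the prompts; `CN=localhost` is an assumption, use your server's hostname) plus a quick inspection of the result:

```shell
# Non-interactive variant: -subj skips the prompts.
# CN=localhost is an assumption -- substitute your server's hostname.
openssl req -x509 -nodes -newkey rsa:4096 \
    -keyout key.pem -out cert.pem -days 365 \
    -subj "/CN=localhost"

# Confirm the subject and expiry of the generated certificate
openssl x509 -in cert.pem -noout -subject -enddate
```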
## Run
```bash
python3.11 -m venv venv
source venv/bin/activate
pip install -U mlipy
python -B -m mli.server
```
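For unattended production use, the server can be kept running under a process manager. A minimal systemd unit sketch, assuming the virtualenv lives at `/opt/mlipy/venv` and the service runs as user `mlipy` (both paths and the user are hypothetical):

```ini
# /etc/systemd/system/mlipy.service  (paths and user are assumptions)
[Unit]
Description=mlipy server
After=network.target

[Service]
User=mlipy
WorkingDirectory=/opt/mlipy
ExecStart=/opt/mlipy/venv/bin/python -B -m mli.server
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now mlipy`.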