LocalAssistant

Name: LocalAssistant
Version: 1.0.1
Summary: LocalAssistant (locas) is an AI designed to be used in CLI. (Currently in development)
Author: Linos
Requires Python: >=3.10
Upload time: 2024-11-30 16:06:53
<div align="center">

# LocalAssistant

**Locas - your local assistant**

[![][latest-release-shield]][latest-release-url]
[![][latest-commit-shield]][latest-commit-url]
[![][python-shield]][python-url]

[latest-release-shield]: https://badgen.net/github/release/Linos1391/LocalAssistant/development?icon=github
[latest-release-url]: https://github.com/Linos1391/LocalAssistant/releases/latest
[latest-commit-shield]: https://badgen.net/github/last-commit/Linos1391/LocalAssistant/main?icon=github
[latest-commit-url]: https://github.com/Linos1391/LocalAssistant/commits/main
[python-shield]: https://img.shields.io/badge/python-3.10+-yellow
[python-url]: https://www.python.org/downloads/

This AI is designed to be used in CLI.

</div>

# Which one should I use?
- The [PyPI version](#download-by-pypi-recommended) is great and works as intended. However, if you want to organize your projects with Anaconda, Docker, or similar tools, it falls short.
- The [GitHub version](#download-by-github) solves that by using PATH, so you can modify the `locas.cmd` file to use Anaconda. However, Unix users have to type `locas.cmd` instead of `locas`.

**Summary:** Windows users may prefer the GitHub version, while the PyPI version suits Unix users. I still recommend PyPI, though.

<br>

# Download by GitHub:

Visit [GitHub](https://github.com/Linos1391/LocalAssistant) and follow the instructions.

<br>

# Download by PyPI: (Recommended)

## Installing

Visit [PyTorch](https://pytorch.org/get-started/locally/) and install the build that matches your device.

```
# Example: Windows with CUDA 12.4

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu124
```
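The `cu124` suffix in that index URL selects wheels built for CUDA 12.4; other tags follow the same pattern (assuming the standard `download.pytorch.org` index layout, e.g. `cu121`, or `cpu` for machines without an NVIDIA GPU). A minimal sketch:

```shell
# Pick the PyTorch wheel index for your setup.
# Assumption: standard download.pytorch.org layout (cu121, cu124, cpu, ...).
CUDA_TAG=cu124
echo "https://download.pytorch.org/whl/${CUDA_TAG}"
```

Swap `CUDA_TAG` for `cpu` on a CPU-only machine and reuse the resulting URL with the `--index-url` flag above.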

After that, install LocalAssistant itself with pip.

```
pip install LocalAssistant
```

<br>

## Preparing

### Chatting:

Before anything else, download a chat model:

```
locas download -n qwen Qwen/Qwen2.5-1.5B-Instruct 1
```

Then start a chat session with `locas start`:

```
locas start
```

### Chatting with memory:

Memory needs a sentence-embedding model, so download one first:

```
locas download -n allmpnetv2 sentence-transformers/all-mpnet-base-v2 2
```

Note that memory is only available through `locas start`. Anyway, let's dive in!

```
locas start -m
```

<br>

## Running

```
locas ...
```

Use `locas -h` for more information.

<br>

## Removing

**Warning:** This action will delete all LocalAssistant files.

```
locas self-destruction pip
```

<br>

## Disclaimer

This AI was designed to communicate with Hugging Face models from the CLI. Please do not use it for unethical purposes. The author is not responsible for any damage resulting from abuse of this application.

            
