# Paita - Python AI Textual Assistant
<img src="https://github.com/villekr/paita/blob/main/imgs/paita.jpg?raw=true" width="800">
Paita is a textual assistant for your terminal that supports multiple AI services and models.
## Key Features
- **Supports Multiple AI Services:** Paita integrates with a variety of AI services through the [LangChain](https://python.langchain.com) library. If an AI service is compatible with LangChain, it can also be used with Paita.
- **Textual User Interface on your terminal:** Paita is based on [Textual](https://textual.textualize.io/) and provides a sophisticated user interface right within your terminal, combining the richness of a GUI with the simplicity of a console.
- **Cross-Platform Compatibility:** Paita is compatible with Windows, macOS, and Linux systems across most terminals; if Python runs in your environment and Textual supports it, then Paita will work.
- **Supports Retrieval-Augmented Generation (RAG):** Paita supports a local vector store (Chroma) and crawling web page content.
### Supported AI Services
* OpenAI
* AWS Bedrock
* Ollama (local models)
## Getting Started
### Prerequisites
- Python 3.8.1+
- Access to an AI service, configured in your terminal
### Installation and running
Install using pip (or pipx):
```
pip install paita
```
Run and enjoy:
```
paita
```
### Some keyboard shortcuts
Paita is a textual UI application, so using keyboard shortcuts is recommended:
* Use `tab` and `shift`+`tab` to navigate between the input field, the send button, and the question/answer boxes
* While a question/answer box is focused, use `enter` to "focus in" and `esc` to "focus out"
* Use `c` to copy content from a question/answer box
* Contextual keyboard shortcuts are shown at the bottom of the UI
### Configuring AI Service(s) and model access
#### OpenAI
Using OpenAI requires a valid API key in an environment variable:
```
export OPENAI_API_KEY=<OpenAI API Key>
```
#### AWS Bedrock
Enable AI model access in AWS Bedrock and configure AWS credentials accordingly.
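As a minimal sketch, the standard AWS SDK environment variables can be used to select the credentials and region where Bedrock model access was enabled; the profile name below is a placeholder:

```
# Placeholder profile name; use a profile from your own AWS config
export AWS_PROFILE=my-bedrock-profile
# Region where you enabled Bedrock model access
export AWS_DEFAULT_REGION=us-east-1
```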
#### Ollama
Ollama enables running chat models locally.
Install [Ollama](https://ollama.com) for your operating system, or use the official [Docker image](https://hub.docker.com/r/ollama/ollama).
Once Ollama is installed, pull the desired [model](https://ollama.com/library) from the registry, e.g.
```
ollama pull llama3
```
By default, Paita connects to the local Ollama endpoint. Optionally, you can configure the endpoint URL with an environment variable:
```
export OLLAMA_ENDPOINT=<protocol>://<ollama-host-address>:<ollama-host-port>
```
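For example, assuming a default local install where Ollama listens on port 11434, the variable would look like:

```
# Default Ollama endpoint on the local machine
export OLLAMA_ENDPOINT=http://localhost:11434
```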
## Feedback
* [Issues](https://github.com/villekr/paita/issues)
* [Discussion](https://github.com/villekr/paita/discussions)