# Hayloft
[![Downloads](https://static.pepy.tech/badge/hayloft/month)](https://pepy.tech/project/hayloft)
[![](https://dcbadge.vercel.app/api/server/EKewT5cYMy?compact=true&style=flat)](https://discord.gg/EKewT5cYMy)
[![Twitter](https://img.shields.io/twitter/url/https/twitter.com/eaturchenkov.svg?style=social&label=Follow%20%40eaturchenkov)](https://twitter.com/eaturchenkov)
A UI tool for LLM frameworks that makes it easy to track prompts and completions, and to store and compare different sessions.
- [Installation](#installation)
- [Usage](#usage)
  - [LlamaIndex](#llamaindex)
  - [BabyAGI](#babyagi)
## Installation
Install the package with pip
```
pip install hayloft
```
## Usage
Start the hayloft server
```
hayloft start
```
Trace the logs of your script at [http://localhost:7000](http://localhost:7000)
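You can optionally confirm the server is reachable before running a traced script. A minimal check, not part of hayloft itself, assuming the default port 7000:
```python
# Optional sanity check: confirm the hayloft UI answers on the default port
# before running a traced script. Uses only the standard library.
import urllib.request

with urllib.request.urlopen("http://localhost:7000") as resp:
    print("hayloft UI is up, HTTP status:", resp.status)
```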
### LlamaIndex
Install `llama-index` and create an `example.py` file as shown below. Put the [examples](https://github.com/jerryjliu/llama_index/tree/main/examples) folder from the llama_index repo next to the file.
```python
import os
os.environ["OPENAI_API_KEY"] = 'YOUR_OPENAI_API_KEY'

from llama_index import SimpleDirectoryReader, VectorStoreIndex
from hayloft.llama_index import grab_logs

grab_logs()

documents = SimpleDirectoryReader("examples/paul_graham_essay/data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
query_engine.query("What did the author do growing up?")
```
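The API key above is hardcoded for brevity. A common variation, unrelated to hayloft itself, is to read the key from the environment and fail fast if it is missing:
```python
import os

# Assumes OPENAI_API_KEY is exported in your shell; the libraries above read
# it from the environment, so there is no need to hardcode it in the script.
api_key = os.getenv("OPENAI_API_KEY")
if api_key is None:
    raise RuntimeError("Set the OPENAI_API_KEY environment variable first")
```
Run the script with `python example.py` and the traced prompts and completions will appear as a new session in the hayloft UI.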
Alternatively, you can start live sessions from the hayloft UI; just modify the code as shown below
```python
import os
os.environ["OPENAI_API_KEY"] = 'YOUR_OPENAI_API_KEY'

from llama_index import SimpleDirectoryReader, VectorStoreIndex
from hayloft.llama_index import listen

def agent(query: str):
    documents = SimpleDirectoryReader("examples/paul_graham_essay/data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    query_engine = index.as_query_engine()
    query_engine.query(query)

listen(agent)
```
Run the script
```
python example.py
```
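Note that `agent` rebuilds the index on every live query. A minimal sketch of one way to avoid that, assuming the standard llama_index persistence API (`StorageContext` and `load_index_from_storage`) and a local `./storage` directory; the hayloft usage is unchanged:
```python
import os
os.environ["OPENAI_API_KEY"] = 'YOUR_OPENAI_API_KEY'

from llama_index import (
    SimpleDirectoryReader,
    VectorStoreIndex,
    StorageContext,
    load_index_from_storage,
)
from hayloft.llama_index import listen

PERSIST_DIR = "./storage"

def load_or_build_index():
    # Reload the index from disk if it was persisted on a previous run,
    # otherwise build it once and persist it for next time.
    if os.path.exists(PERSIST_DIR):
        storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
        return load_index_from_storage(storage_context)
    documents = SimpleDirectoryReader("examples/paul_graham_essay/data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=PERSIST_DIR)
    return index

query_engine = load_or_build_index().as_query_engine()

def agent(query: str):
    query_engine.query(query)

listen(agent)
```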
### BabyAGI
Clone the BabyAGI fork repo, set up a virtual environment, and install the dependencies
```
git clone git@github.com:eturchenkov/babyagi-hayloft.git && cd babyagi-hayloft
pip install -r requirements.txt
```
Adjust the config in the `.env` file and start the `babyagi.py` script
```
python babyagi.py
```
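The exact settings live in the fork's `.env` file. The sketch below is illustrative only; apart from `OPENAI_API_KEY`, the variable names are assumptions based on upstream BabyAGI, so check the repo's `.env.example` for the authoritative list.
```
# Illustrative .env sketch -- verify names against the repo's .env.example
OPENAI_API_KEY=YOUR_OPENAI_API_KEY
OBJECTIVE=Solve world hunger
INITIAL_TASK=Develop a task list
```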