<img src="https://github.com/user-attachments/assets/b30e2c38-dc19-49a5-973d-51e1dafe5c4d" width="100%" />
<br/>
Use the CLI to upload, manage, and query documents with fine-tuned LLM models. It uses the Smartloop API to manage projects and documents, giving you an easy way to quickly process content and reason over it.

## Requirements
- Python 3.11
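If you are not sure which Python you have, a quick check (assuming `python3` is on your `PATH`):
```bash
python3 --version   # should print Python 3.11.x
```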
## Installation
Install the CLI with the following command:
```
pip install -U smartloop
```
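If you prefer to keep the CLI isolated from your system packages, here is a minimal sketch using a virtual environment (standard Python tooling, not specific to Smartloop):
```bash
python3 -m venv .venv          # create a virtual environment
source .venv/bin/activate      # activate it (on Windows: .venv\Scripts\activate)
pip install -U smartloop       # install the CLI into the environment
```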
Once installed, check that everything is set up correctly:
```console
smartloop --help
Usage: smartloop [OPTIONS] COMMAND [ARGS]...
╭─ Options ─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ --install-completion Install completion for the current shell. │
│ --show-completion Show completion for the current shell, to copy it or customize the installation. │
│ --help Show this message and exit. │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Commands ────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ agent Manage agent(s) │
│ login Authenticate using a token from https://api.smartloop.ai/v1/redoc │
│ run Starts a chat session with a selected agent │
│ upload Upload document for the selected agent │
│ version Version of the cli │
│ whoami Find out which account you are logged in │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
```
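You can also confirm the installed version with the `version` command listed above:
```bash
smartloop version
```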
## Setup
First, create a free [account](https://agent.smartloop.ai/signup), then verify and configure it.
Once verified, copy your [developer token](https://agent.smartloop.ai/developer) to the clipboard. If you have any problems setting up your account, reach out to us at `hello@smartloop.ai` and we will get you started.
Once you have your token, run the following command in your terminal:
```bash
smartloop login
```
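To confirm which account you are authenticated as, use the `whoami` command shown in the help output:
```bash
smartloop whoami
```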
## Create an Agent
Once you have configured the CLI, you can create an agent with the following command:
```bash
smartloop agent create --name microsoft
```
## Select an Agent
Use the following command to interactively select an agent:
```bash
smartloop agent select
```
## Upload Document
Once the agent is selected, upload documents from a folder or a specific file to personalize your agent. In this case, I am uploading a document describing Microsoft online services from my local machine:
```bash
smartloop upload --path=~/document1.pdf
```
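As noted above, you can also point `--path` at a folder to upload the documents inside it. For example, with a hypothetical `~/contracts` directory:
```bash
smartloop upload --path=~/contracts
```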
## Run It
Execute the following command to start prompting:
```bash
smartloop run
```
This will bring up an interface where you can enter your prompts, as shown below:
```bash
Microsoft(microsoft-24-07-2024)
======================================
Enter prompt (Ctrl-C to exit):
what the SLA for azure open ai
⠋
The SLA (Service Level Agreement) for Azure OpenAI is not explicitly mentioned in the provided text. However, it's possible that the SLA for Azure OpenAI might be similar to the one mentioned below:
"Uptime Percentage"
* Service Credit:
+ < 99.9%: 10%
+ < 99%: 25%
+ < 95%: 100%
Please note that this is not a direct quote from the provided text, but rather an inference based on the format and structure of the SLA mentioned for other Azure services (e.g., SAP HANA on Azure High Availability Pair). To confirm the actual SLA for Azure OpenAI, you should check the official Microsoft documentation or contact their support team.
Prompt message (Ctrl-C to exit):
```
To set the `temperature` of your conversation, which ranges from 0.0 to 1.0, use the following command:
```bash
smartloop agent set --id=project_id --temp=0.3
```
To enable memory to retain context in the conversation, use the following command:
```bash
smartloop agent set --id=project_id --memory
```
To disable memory, use the following command:
```bash
smartloop agent set --id=project_id --no-memory
```
LLM temperature is a parameter that influences the language model's output, determining whether the output is more random and creative or more predictable. Higher values tend toward more creative answers.
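For example, to make the same agent answer more conservatively or more creatively (hypothetical agent id, using the `--temp` option shown above):
```bash
smartloop agent set --id=project_id --temp=0.1   # more predictable answers
smartloop agent set --id=project_id --temp=0.9   # more creative answers
```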
## Supported Document Types
* PDF
* DOCX
* TXT
* CSV
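All of these formats go through the same `upload` command; for example, a hypothetical CSV export:
```bash
smartloop upload --path=~/sales-report.csv
smartloop run   # then ask questions about the uploaded data
```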
## Contributing
Contributions are welcome! Please create a pull request with your changes.
## Contact
If you have any questions or suggestions, please feel free to reach out to hello@smartloop.ai
## References
* [Smartloop API Documentation](https://api.smartloop.ai/v1/redoc)
* [Meta LLaMA](https://research.facebook.com/publications/llama-open-and-efficient-foundation-language-models/)
* [LoRA](https://arxiv.org/abs/2106.09685)
## License
This project is licensed under the terms of the MIT license.