smartloop

Name: smartloop
Version: 1.1.6
Home page: https://github.com/LexicHQ/smartloop
Summary: Smartloop Command Line interface to process documents using LLM
Upload time: 2024-09-22 03:03:03
Author: Smartloop Inc.
License: LICENSE.txt
Keywords: llm, framework, llama3, phi3, platform, document
            
<img src="https://github.com/user-attachments/assets/b30e2c38-dc19-49a5-973d-51e1dafe5c4d" width="100%" />

<br/>

Use the CLI to upload, manage, and query documents using fine-tuned LLM models. It uses the Smartloop API to manage projects and documents and gives you an easy way to quickly process content and reason over it.


![PyPI - Version](https://img.shields.io/pypi/v/smartloop)

## Requirements

- Python 3.11

## Installation

Install the CLI with the following command:

```bash
pip install -U smartloop
```

Once installed, check that everything is set up correctly:

![image](https://github.com/user-attachments/assets/0a4e0221-d2f7-4f87-9fb2-5e4ce7a23f62)
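
As an alternative to the screenshot above, you can run a quick sanity check from the terminal (this assumes the CLI exposes the conventional `--help` flag):

```bash
# List the available subcommands and options
smartloop --help
```

If the installation worked, this should list subcommands such as `login`, `project`, `upload`, and `run`.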


## Setup
First, you will need to create a free [account](https://app.smartloop.ai/signup), then verify and configure it. Once verified, copy your [developer token](https://app.smartloop.ai/developer) to the clipboard. As of this writing, you will need an invitation code; please reach out to us at `hello@smartloop.ai` and we should be able to get you started.

Once you have your token, run the following command in your terminal:

```bash
smartloop login
```

This command will prompt you for your token; copy and paste the token that you received in your email. The next step is to create a project, which you can do with the following command:

```bash
smartloop project create --name Lexic
```

To get the project ID, use the following command, which will also show you the currently selected project:

```bash
smartloop project list
```

To delete a project, use:

```bash
smartloop project delete --id=project_id
```

## Upload Document

Once the project is created, upload documents from a folder or a specific file. In this case, I am uploading a document describing Microsoft online services from my local machine:

```bash
smartloop upload --id=<project_id> --path=~/document1.pdf
```
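
If `--path` does not accept a folder directly in your version of the CLI, one option is a small shell loop that reuses the single-file upload command shown above (the folder path here is only an illustrative placeholder):

```bash
# Upload every PDF in a local folder, one file at a time,
# using the same `smartloop upload` flags documented above.
for f in ~/documents/*.pdf; do
  smartloop upload --id=<project_id> --path="$f"
done
```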



## Select a project

Use the following command to interactively select a project:


```bash
smartloop project select
```

## Run It

Finally, once a project is selected and your uploaded document has been processed, run the CLI to start prompting:

```bash
smartloop run
```

This will bring up a prompt where you can query information from your uploaded documents:

```bash
Current project: Microsoft(microsoft-24-07-2024)
Enter message (Ctrl-C to exit): what the SLA for azure open ai
⠋
The SLA (Service Level Agreement) for Azure OpenAI is not explicitly mentioned in the provided text. However, it's possible that the SLA for Azure OpenAI might be similar to the one mentioned below:

"Uptime Percentage"

* Service Credit:
+ < 99.9%: 10%
+ < 99%: 25%
+ < 95%: 100%

Please note that this is not a direct quote from the provided text, but rather an inference based on the format and structure of the SLA mentioned for other Azure services (e.g., SAP HANA on Azure High Availability Pair). To confirm the actual SLA for Azure OpenAI, you should check the official Microsoft documentation or contact their support team.

Enter message (Ctrl-C to exit):
```

To set the `temperature` of your conversation, which ranges from 0.0 to 1.0, use the following command:

```bash
smartloop project set --id=project_id --temp=0.3
```

LLM temperature is a parameter that influences the language model's output, determining whether the output is more random and creative or more predictable. A higher value tends toward more creative answers.
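
For example, using the same documented command (with a placeholder project ID), a low value keeps answers closer to the uploaded documents, while a high value allows more free-form responses:

```bash
# More deterministic, document-grounded answers
smartloop project set --id=project_id --temp=0.1

# More creative, free-form answers
smartloop project set --id=project_id --temp=0.9
```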

## Supported Document Types

* PDF
* DOCX
* TXT
* CSV (soon)


## Contributing

Contributions are welcome! Please create a pull request with your changes. 


## Contact

If you have any questions or suggestions, please feel free to reach out to hello@smartloop.ai.


## References

* [Smartloop API Documentation](https://api.smartloop.ai/v1/redoc)
* [Meta LLaMA](https://research.facebook.com/publications/llama-open-and-efficient-foundation-language-models/)



## License

This project is licensed under the terms of the MIT license.

            
