superu

Name: superu
Version: 0.0.8
Summary: A package that helps optimize your chatbot response
Upload time: 2024-04-30 05:31:09
Author: SuperU
Home page: None
Maintainer: None
Docs URL: None
Requires Python: None
License: None
Keywords: python, chatbot, ecommerce, tag generation, semantic search, openai
            
# superU Python Package



## Intent Classification for LLM



This functionality helps you classify the intent of user queries. The defined intent categories and additional tags are listed below, followed by a minimal usage sketch.



### Intent categories for user queries:



- Informational

- Navigational

- Transactional

- Commercial



### Additional tags:



- Human Support (the user requests human support)

- Support (the user is looking for help)

- FAQ

- Language: {English, Hindi, Mandarin}
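
As a quick illustration, here is a minimal sketch of classifying a single query. It uses the `Intent_Classification` class and `get_intent` method shown in the full example later in this README; the exact format of the returned tags may vary by version.

```python
import superu

# Instantiate the intent classifier provided by the package
classifier = superu.superU.Intent_Classification()

# Classify a single user query; the result combines an intent
# category with any additional tags that apply
intent = classifier.get_intent("I want to talk to a human agent about my refund")
print(intent)
```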





## User Persona



This functionality identifies a user's persona from their conversations or statements. By analyzing the text, it extracts key aspects of the persona, such as age, gender, profession, hobbies, relationship status, and city.



### Requirements for building a user persona with open-source LLaMA3

To run LLaMA3 inference on a CPU, you must first complete the Ollama setup.

1. Ensure Docker is installed on your machine. You can install it from https://docs.docker.com/engine/install/

2. Run the command below in your terminal to pull the Ollama image:

```bash
docker pull ollama/ollama
```

3. Start the Ollama container with the following command:

```bash
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```

4. Open the container's Exec/Terminal tab in Docker Desktop and run the following command to download and start the LLaMA3 model (see the alternative one-liner after this list):

```bash
ollama run llama3
```

5. Once you have completed the above steps, you can build user personas with open-source LLaMA3 via Ollama.
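
If you prefer to run step 4 from your host terminal rather than Docker Desktop's Exec tab, an equivalent command (assuming the container was named `ollama` as in step 3) is:

```bash
docker exec -it ollama ollama run llama3
```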



### Key data points considered

- Age

- Gender

- City

- Profession

- Relationship Status

- Interests

- Contact Info
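
As a quick sketch, building a persona with the LLaMA3 backend uses the `Build_User_Persona_LLaMA3` class shown in the full example below; this assumes the Ollama container set up above is running locally:

```python
import superu

# Uses the local LLaMA3 model served by the Ollama container
persona_builder = superu.superU.Build_User_Persona_LLaMA3()

# Pass one or more user statements; the builder returns the persona
# attributes it can infer (age, city, profession, interests, ...)
statements = ["I'm a 28-year-old software engineer from Mumbai who loves trekking."]
user_persona = persona_builder.build(statements)
print(user_persona)
```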





## LLM Analytics



This functionality lets you track usage metrics (cost, latency, quality) for the LLM behind your chatbot and gain key insights about your users.





### Get your API Credentials for LLM Analytics



To start using superU's free Analytics API service, follow these steps:



1. Visit analytics.superu.ai.

2. Sign up for a free account or log in if you already have one.

3. Create a new Project.

4. Navigate to the Settings > API keys section.

5. Generate your superU API keys.
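
With the keys generated, a minimal sketch of logging one conversation to the analytics service looks like the snippet below (all field values are placeholders; the full end-to-end example is in the Usage section):

```python
import superu

# Authenticate with the API keys generated in step 5 (placeholders here)
analytics = superu.superU.LLM_Analytics(public_key="<public-key>", secret_key="<secret-key>")

# Log one conversation turn; input_messages, output_messages and model
# are required, the remaining fields in the full example are optional
analytics.analyse({
    "input_messages": [{"role": "user", "content": "What is 1 + 1"}],
    "output_messages": "1 + 1 equals 2.",
    "model": "example-model",  # placeholder model name
})
```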





## Usage



### Installing the package



```bash
pip install superu
```



### Example Usage

```python
import openai
import superu

# Configure the OpenAI client (Azure-style settings shown here)
openai.api_type = ""
openai.api_key = ""
openai.azure_endpoint = ""
openai.api_version = ""

# Initialise the superU helpers
User_Persona_1 = superu.superU.Build_User_Persona_OpenAI(openai, llm_deploymentname='')
User_Persona_2 = superu.superU.Build_User_Persona_LLaMA3()
Intent_Classification_1 = superu.superU.Intent_Classification()

prompt = "What is 1 + 1"

intent = Intent_Classification_1.get_intent(prompt)
user_persona_openai = User_Persona_1.build([prompt])
user_persona_llama3 = User_Persona_2.build([prompt])

# Send the prompt to the model so the call can be logged for analytics
messages = [{"role": "system", "content": "You are a helpful assistant."}]
messages.append({"role": "user", "content": prompt})
response = openai.chat.completions.create(model="", messages=messages)

data = {
    "input_messages": messages,                                      # Required - input messages
    "output_messages": response.choices[0].message.content,          # Required - the output from the model
    "metadata": {"user": "test-user", "context": "openai testing"},  # Optional - metadata for the conversation
    "model": response.model,                                         # Required - name of the model
    "user_id": "pip package test",                                   # Optional - generated if not given
    "usage": response.usage.model_dump(),                            # Optional - usage details to track model usage and costs
    "name": "test"                                                   # Optional - a name for the conversation
}

service_client = superu.superU.LLM_Analytics(public_key="", secret_key="")

service_client.analyse(data)

print("Intent: ", intent)
print("User Details OpenAI: ", user_persona_openai)
print("User Details LLaMA3: ", user_persona_llama3)
```





## Contributing



We welcome contributions to this project! If you have suggestions for improvements or bug fixes, feel free to open an issue or submit a pull request.


            
