# Prompt-AI Library
**Prompt-AI** is a powerful library designed to optimize AI-driven prompt handling and response generation using the Gemini API. By introducing structured database management and efficient embedding retrieval, Prompt-AI significantly enhances performance, reduces response times, and provides a seamless solution for integrating AI models into various applications.
## Key Features
- **Efficient Embedding Management**: Prompt-AI stores pre-generated embeddings in a structured database, significantly reducing computational overhead and improving response times.
- **Real-Time Updates**: Manage datasets and dataframes efficiently, ensuring that embeddings are generated once and reused across multiple sessions.
- **Performance and Scalability**: The streamlined approach enhances performance and scalability, making Prompt-AI ideal for chatbots, recommendation systems, and other AI-powered tools.
- **Versatile Integration**: Seamlessly integrates with Node.js endpoint servers, bridging different technologies and workflows.
## Installation
To install Prompt-AI, use pip:
```bash
pip install prompt-ai
```
To upgrade to the latest version:
```bash
pip install prompt-ai --upgrade
```
---
## After installation, follow these steps to use prompt-ai
### 1. Generate an API Key
To begin, you’ll need to generate an API key. Follow the link below to generate your API key:
[Generate API Key](https://aistudio.google.com/app/apikey?_gl=1*1ohn5hn*_ga*MTc3OTQxNzg5OC4xNzIyNDE2MDUx*_ga_P1DBVKWT6V*MTcyMzM3NTkzOS4xMi4xLjE3MjMzNzYxODUuNTYuMC4xMzQ0NjE1MTM2)
### 2. Brief Summary of the Gemini Model
Gemini is a powerful AI model designed to generate contextually relevant responses to user prompts. Unlike traditional approaches that regenerate embeddings on every run, Prompt-AI uses a more efficient workflow: it stores pre-generated embeddings in a NoSQL database. This allows faster response times and reduces computational overhead, making it ideal for applications like chatbots, recommendation systems, and other AI-powered tools.
### 3. Setting up MongoDB (support for SQL and cloud databases is planned for later versions)
1. **Create a database in MongoDB Atlas or MongoDB Compass (whichever you prefer).**
2. **Create a collection and its documents.**
3. **Structure each document as follows:**
```json
{
  "id": 1,
  "title": "Gork vs Chat-gpt",
  "text": "In the rapidly evolving landscape of artificial inte..."
}
```
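Before wiring up `configure`, it can help to check that every document carries these fields. A minimal sketch (the `validate_doc` helper is illustrative, not part of Prompt-AI; insert the documents with whatever MongoDB client you use):

```python
# Illustrative helper (not part of Prompt-AI): check that each document
# carries the fields the library expects before inserting it into MongoDB.
REQUIRED_FIELDS = {"id", "title", "text"}

def validate_doc(doc: dict) -> bool:
    # 'text' must hold the answer content; the other fields are metadata
    return REQUIRED_FIELDS.issubset(doc.keys())

docs = [
    {"id": 1, "title": "Gork vs Chat-gpt",
     "text": "In the rapidly evolving landscape of artificial inte..."},
]
print(all(validate_doc(d) for d in docs))  # → True

# e.g. with pymongo: client[db_name][collection_name].insert_many(docs)
```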
### 4. Using Prompt-AI to manage prompts and generate responses
**Prompt-AI** provides two core functions to help you manage prompts and generate responses:
#### 1) `configure(mongo_uri: string, db_name: string, collection_name: string, columns: array, API_KEY: string, embeddings: bool)`
```python
configure(mongo_uri, db_name, collection_name, columns, API_KEY, embeddings)
```
This function configures the connection to your MongoDB Atlas cluster and sets up the parameters needed for generating embeddings.
- **mongo_uri**: `string`
This should contain your MongoDB Atlas connection string.
```python
mongo_uri = 'mongoDB connection string'
```
- **db_name**: `string`
The name of your MongoDB database.
```python
db_name = 'Database Name'
```
- **collection_name**: `string`
The name of the collection in your database where the data is stored.
```python
collection_name = 'collection name'
```
- **columns**: `array`
An array of strings, each naming a field present in every document of the collection. The field that contains the **answer** data must be named `'text'`.
```python
columns = ['id', 'title', 'text']
```
- **API_KEY**: `string`
The API key generated in the first step.
```python
API_KEY = 'key generated in first step'
```
- **embeddings**: `bool`
A boolean flag indicating whether embeddings need to be created (`True`) or already exist in the collection (`False`).
```python
embeddings = True  # or False
```
This function call returns the dataset as a tabular dataframe:
```text
id  title                text                   embeddings
1   "chat-gpt features"  "chat-gpt has..."      [4.0322, 2.3344, 1.09...]
2   "Gork vs Chat-gpt"   "Gork have plenty..."  [1.1702, 0.4184, 5.19...]
```
Overall, the **configure** call looks like this:
```python
uri = 'your connection string'
db = 'database_name'
col = 'collection_name'
column = ['id', 'title', 'text']
API_KEY = 'api_key generated in first step'
embeddings = True
dataframe = configure(uri, db, col, column, API_KEY, embeddings)
```
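Since `embeddings=True` regenerates embeddings, one simple pattern is to cache the returned dataframe on disk and call `configure` only when no cache exists. This is just a sketch (the `load_or_configure` helper and the pickle cache file are illustrative, not part of Prompt-AI's API):

```python
import os
import pickle

CACHE_PATH = "dataframe.pkl"  # illustrative cache file, not a Prompt-AI name

def load_or_configure(make_df):
    """Reuse a cached dataframe if one exists; otherwise call make_df()
    (e.g. a lambda wrapping configure(...)) and cache its result."""
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH, "rb") as f:
            return pickle.load(f)
    df = make_df()
    with open(CACHE_PATH, "wb") as f:
        pickle.dump(df, f)
    return df

# Usage: df = load_or_configure(lambda: configure(...))
```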
#### 2) `generate(user_prompt, dataframe)`
```python
generate(user_prompt, dataframe)
```
This function processes the user’s prompt, interacts with the database, and returns the AI-generated response.
The **dataframe** returned by `configure()` is used inside the `generate()` call:
```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PromptRequest(BaseModel):
    prompt: str

@app.post("/api")
async def generate_response(request: PromptRequest):
    # Extract the prompt from the request body
    user_prompt = request.prompt
    # Call generate() with the prompt and the dataframe from configure()
    response = generate(user_prompt, df)
    return {"response": response}
```
When making a POST request to this endpoint, include a `prompt` string in the JSON body of the request.
You can handle the response like this:
```javascript
const response = await axios.post('endpoint_url/api', {prompt});
const res = response.data.response;
```
Here I have used `axios`, but you can also use the `fetch` API to make the POST request and read the response in the same way.
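For reference, a `fetch`-based sketch (the `buildPromptRequest` helper is illustrative, and `endpoint_url/api` is a placeholder for your server's address):

```javascript
// Illustrative helper: build the options object for the POST request.
// The prompt travels as JSON in the body, matching the FastAPI handler.
function buildPromptRequest(prompt) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  };
}

// Usage:
// const response = await fetch('endpoint_url/api', buildPromptRequest('Hi'));
// const answer = (await response.json()).response;
```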
---
With **Prompt-AI**, you can efficiently manage AI-driven prompt handling, leveraging the Gemini model's capabilities with enhanced performance and scalability. Whether you’re building a chatbot, a recommendation system, or any other AI-powered application, Prompt-AI provides a streamlined and powerful solution.
## Happy Coding :)
---