# 🧠 Django Supportal – AI-Powered Business Support Chat APIs for Django projects
**Django Supportal** is an intelligent, AI-powered customer support system built with **Django**, **Django Channels**, and the **OpenAI API**.
It provides APIs for businesses to upload their internal documents; a smart assistant then handles customer inquiries via live chat, powered by a Retrieval-Augmented Generation (RAG) system.
---
## 🚀 Features
- ✅ Real-time chat via **Django Channels (WebSockets)** (see the consumer sketch below)
- 📎 Businesses can upload **PDF, DOCX, or TXT documents**
- 🤖 Uses **OpenAI GPT models** to provide intelligent responses
- 📚 Implements **RAG (Retrieval-Augmented Generation)** to process custom business knowledge
- 🔒 Secure communication and a Redis-based channel layer
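As a rough illustration of the WebSocket side, a minimal Django Channels consumer for a support chat might look like the sketch below. This is a hypothetical example, not Supportal's actual consumer; the class name and message format are assumptions.

```python
# chat/consumers.py – hypothetical minimal consumer, not Supportal's actual code.
import json

from channels.generic.websocket import AsyncWebsocketConsumer


class SupportChatConsumer(AsyncWebsocketConsumer):
    """Echo-style consumer illustrating the WebSocket flow of a support chat."""

    async def connect(self):
        # Accept the WebSocket handshake for the chat session.
        await self.accept()

    async def receive(self, text_data=None, bytes_data=None):
        payload = json.loads(text_data)
        question = payload.get("message", "")
        # In a real deployment the question would be passed to the RAG pipeline
        # (see the sketch later in this README) and the generated answer returned.
        await self.send(text_data=json.dumps({"answer": f"Received: {question}"}))

    async def disconnect(self, close_code):
        # Clean up any per-connection state here.
        pass
```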
---
## 🧠 How it Works (RAG Architecture)
Supportal uses a **Retrieval-Augmented Generation (RAG)** approach to enable the AI to answer business-specific questions (a code sketch of the full flow follows these steps):
1. **Document Upload:**
Businesses upload documents such as FAQs, product guides, manuals, or policies.
2. **Chunking & Embedding:**
Uploaded documents are:
- Split into smaller text chunks
- Converted into **vector embeddings** using OpenAI's `text-embedding` models
3. **Vector Storage:**
Embeddings are stored in a **vector database** (like FAISS) for fast similarity search.
4. **Chat Inference:**
- When a customer sends a message, it's embedded and compared against stored chunks.
- The most relevant chunks are selected as **context**.
- The context is fed into OpenAI's **chat completion API** along with the user's question.
- A tailored, relevant answer is generated based on actual business documents.
> This allows Supportal to **answer domain-specific questions accurately**, beyond what a generic AI model can do.
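
The snippet below is a minimal, self-contained sketch of this pipeline using the `openai` and `faiss` Python packages. It is illustrative only, not Supportal's internal implementation: the chunk size, model names (`text-embedding-3-small`, `gpt-4o-mini`), and function names are assumptions for the example.

```python
# Minimal RAG sketch (illustrative only, not Supportal's internal implementation).
# Assumes: `pip install openai faiss-cpu numpy` and OPENAI_API_KEY in the environment.
import faiss
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EMBED_MODEL = "text-embedding-3-small"   # assumed embedding model
CHAT_MODEL = "gpt-4o-mini"               # assumed chat model
CHUNK_SIZE = 500                         # characters per chunk (assumption)


def chunk_text(text: str, size: int = CHUNK_SIZE) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]


def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of texts with the OpenAI embeddings API."""
    resp = client.embeddings.create(model=EMBED_MODEL, input=texts)
    return np.array([item.embedding for item in resp.data], dtype="float32")


def build_index(document_text: str) -> tuple[faiss.IndexFlatL2, list[str]]:
    """Chunk a document, embed the chunks, and store them in a FAISS index."""
    chunks = chunk_text(document_text)
    vectors = embed(chunks)
    index = faiss.IndexFlatL2(vectors.shape[1])
    index.add(vectors)
    return index, chunks


def answer(question: str, index: faiss.IndexFlatL2, chunks: list[str], k: int = 3) -> str:
    """Embed the question, retrieve the top-k chunks, and ask the chat model."""
    q_vec = embed([question])
    _, ids = index.search(q_vec, k)
    context = "\n\n".join(chunks[i] for i in ids[0])
    resp = client.chat.completions.create(
        model=CHAT_MODEL,
        messages=[
            {"role": "system", "content": f"Answer using only this business context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content
```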
---
## 🛠️ Tech Stack
- **Backend:** Django + Django Channels
- **Realtime Layer:** Redis (via `channels_redis`; example configuration below)
- **AI Engine:** OpenAI API (GPT + Embeddings)
- **Vector DB:** FAISS (in-memory vector search)
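
For the realtime layer, a typical `channels_redis` setup in `settings.py` looks like the following sketch (the Redis host and port are placeholders; point them at your own instance):

```python
# settings.py – example Django Channels configuration with a Redis channel layer.
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            # Placeholder host/port; adjust for your deployment.
            "hosts": [("127.0.0.1", 6379)],
        },
    },
}
```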
---
## 📦 Getting Started
### 🔧 Prerequisites
- Django 4.2.2
- Channels
- Celery
- Redis
- OpenAI API key
### 🧪 Installation
```bash
# Install the published package from PyPI
pip install django-supportal
```
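
Project-specific setup steps are not yet documented here. A typical Django integration would look roughly like the sketch below; the app label and any package-specific settings are assumptions, so consult the project's repository for authoritative instructions.

```python
# settings.py – hypothetical integration sketch; the actual app label and any
# settings expected by django-supportal may differ from what is shown here.
INSTALLED_APPS = [
    # ... your existing apps ...
    "channels",
    "django_supportal",  # assumed app label
]

ASGI_APPLICATION = "your_project.asgi.application"  # required by Django Channels

# The OpenAI client reads credentials from the OPENAI_API_KEY environment variable.
```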
## 📄 License
This project is licensed under the **MIT License** – see the [LICENSE](./LICENSE) file for details.