| Field | Value |
| --- | --- |
| Name | neurosurf |
| Version | 0.1.4 |
| home_page | None |
| Summary | Neurosurf: production-ready AI agent framework with multi-LLM, RAG, tools, and FastAPI server |
| upload_time | 2025-11-02 18:20:30 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | None |
| keywords | llm, agents, rag, transformers, fastapi, chromadb |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
<div align="center">
<img src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/neurosurf_logo_text_black.png" alt="Neurosurf — AI Agent Framework" width="50%"/>
<img src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/neurosurf_banner.svg" alt="Neurosurf — AI Agent Framework" width="100%"/>
<a href="https://naumanhsa.github.io/neurosurf/#quick-start"><img src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/quick_start_button.png" height="40" alt="Quick Start"></a>
<a href="https://naumanhsa.github.io/neurosurf/examples/"><img src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/examples_button.png" height="40" alt="Examples"></a>
<a href="https://naumanhsa.github.io/neurosurf/"><img src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/documentation_button.png" height="40" alt="Documentation"></a>
<a href="https://pypi.org/project/neurosurf/"><img src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/pypi_button.png" height="40" alt="PyPI"></a>
<a href=""><img src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/Discord_button.png" height="40" alt="Discord"></a>
</div>
**Neurosurf** helps you build intelligent apps that blend **LLM reasoning**, **tools**, and **retrieval** with a ready-to-run **FastAPI** backend and a **React** dev UI. Start lean, add power as you go — CPU-only or GPU-accelerated.
- 🧩 **OpenAI-style API** with streaming & tool-calling
- 📚 **RAG-ready**: ingest → chunk → retrieve → augment
- 🤖 **Agents** (ReAct, SQL, RAG) + 🔧 **Tools** (calc, web, custom)
- 🧠 **Multi-LLM**: OpenAI, Transformers/Unsloth, vLLM, Llama.cpp, more
- 🖥️ **NeurowebUI** (React) for chat UX, threads, uploads
## 🗞️ News
- **CLI `serve` improvements** — run backend-only or UI-only, inject `VITE_BACKEND_URL` automatically. See [:link: CLI guide](cli.md).
- **Model registry & RAG hooks** — easier wiring for multi-model setups. See [:link: Example App](server/example-app.md).
- **Optional LLM stack** — install heavy deps only when you need them:
```bash
pip install "neurosurf[torch]"
```
> Looking for older updates? Check the repo **Releases** and **Changelog**.
## ⚡ Quick Start
A 60-second path from install → dev server → your first inference.
**Install (minimal core):**
```bash
pip install -U neurosurf
```
**Or install the full LLM stack (torch, transformers, bitsandbytes, unsloth):**
```bash
pip install -U "neurosurf[torch]"
```
**Run the dev server (backend + UI):**
```bash
neurosurf serve
```
- Auto-detects the UI; pass `--ui-root` if needed. The first run may trigger an `npm install`.
- The backend binds to its config defaults; override them with CLI flags or environment variables.
**Hello LLM Example:**
```python
from neurosurf.models.chat_models.transformers import TransformersModel
llm = TransformersModel(
    model_name="unsloth/Llama-3.2-1B-Instruct-bnb-4bit",
    load_in_4bit=True,
)
res = llm.ask(user_prompt="Say hi!", system_prompt="Be concise.", stream=False)
print(res.choices[0].message.content)
```
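The feature list above also mentions streaming. Continuing the example, the sketch below shows one way to consume a streamed reply; it assumes that `stream=True` yields OpenAI-style chunks exposing `choices[0].delta.content`, which may differ from the exact shape in your installed version:

```python
# Streaming sketch (assumption: stream=True returns an iterator of
# OpenAI-style chunks exposing choices[0].delta.content).
for chunk in llm.ask(user_prompt="Tell me a joke.", system_prompt="Be concise.", stream=True):
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```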
## 🏗️ High-Level Architecture
<div align="center">
<img alt="Neurosurf Architecture" src="https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/neurosurf_architecture_light.png" width="100%"/>
<p><strong>Neurosurf Architecture</strong></p>
</div>
## ✨ Key Features
- **Production API** — FastAPI backend with auth, chat APIs, and OpenAI-compatible endpoints (see the client sketch after this list) → [Server setup](https://naumanhsa.github.io/neurosurf/server/)
- **Intelligent Agents** — Build ReAct, SQL, and RAG agents with minimal code, optimized for specific tasks → [Learn about agents](https://naumanhsa.github.io/neurosurf/api-reference/agents/)
- **Rich Tool Ecosystem** — Built-in tools (calculator, web calls, files) plus easy custom tools → [Explore tools](https://naumanhsa.github.io/neurosurf/api-reference/tools/)
- **RAG System** — Ingest, chunk, and retrieve relevant context for your LLMs → [RAG System](https://naumanhsa.github.io/neurosurf/api-reference/rag/)
- **Vector Databases** — Built-in ChromaDB with an extensible interface for other stores → [Vector stores](https://naumanhsa.github.io/neurosurf/api-reference/vectorstores/)
- **Multi-LLM Support** — OpenAI, Transformers/Unsloth, vLLM, Llama.cpp, and OpenAI-compatible APIs → [Model docs](https://naumanhsa.github.io/neurosurf/api-reference/models/)
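Because the backend exposes OpenAI-compatible endpoints, any standard OpenAI client can talk to a running Neurosurf server. Below is a minimal sketch using the official `openai` Python package; the base URL, API key, and model name are illustrative assumptions rather than documented defaults, so check the server setup guide for the actual values.

```python
from openai import OpenAI

# Point the standard OpenAI client at a locally running Neurosurf server.
# base_url, api_key, and model are placeholders for this sketch.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

resp = client.chat.completions.create(
    model="unsloth/Llama-3.2-1B-Instruct-bnb-4bit",
    messages=[
        {"role": "system", "content": "Be concise."},
        {"role": "user", "content": "Say hi!"},
    ],
)
print(resp.choices[0].message.content)
```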
## 📦 Install Options
**pip (recommended)**
```bash
pip install -U neurosurf
```
**pip + full LLM stack**
```bash
pip install -U "neurosurf[torch]"
```
**From source**
```bash
git clone https://github.com/NaumanHSA/neurosurf.git
cd neurosurf && pip install -e ".[torch]"
```
CUDA notes (Linux x86_64):
```bash
# Wheels bundle CUDA; you just need a compatible NVIDIA driver.
pip install -U torch --index-url https://download.pytorch.org/whl/cu124
# or CPU-only:
pip install -U torch --index-url https://download.pytorch.org/whl/cpu
```
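After installing, you can verify which torch build you ended up with and whether it can see your GPU. This check uses only standard PyTorch calls:

```python
# Quick sanity check for the installed torch build (CPU vs. CUDA).
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```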
## 📝 License
Licensed under **Apache-2.0**. See [`LICENSE`](https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/LICENSE).
## 🌟 Support
- ⭐ Star the project on [GitHub](https://github.com/NaumanHSA/neurosurf).
- 💬 Ask & share in **Discussions**: [Discussions](https://github.com/NaumanHSA/neurosurf/discussions).
- 🧠 Read the [Docs](https://naumanhsa.github.io/neurosurf/).
- 🐛 File [Issues](https://github.com/NaumanHSA/neurosurf/issues).
- 🔒 Security: report privately to **naumanhsa965@gmail.com**.
## 📚 Citation
If you use **Neurosurf** in your work, please cite:
```bibtex
@software{neurosurf,
  author  = {Nouman Ahsan and Neurosurf contributors},
  title   = {Neurosurf: A Production-Ready AI Agent Framework},
  year    = {2025},
  url     = {https://github.com/NaumanHSA/neurosurf},
  version = {0.1.0},
  license = {Apache-2.0}
}
```
---
<div align="center">
<sub>Built with ❤️ by the Neurosurf team</sub>
</div>
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "neurosurf",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": "Nouman Ahsan <naumanhsa965@gmail.com>",
"keywords": "LLM, agents, RAG, transformers, FastAPI, ChromaDB",
"author": null,
"author_email": "Nouman Ahsan <naumanhsa965@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/7c/88/de56a26497287b31910851523801d3cbc5d8006e2ed96d46a307ff18740f/neurosurf-0.1.4.tar.gz",
"platform": null,
"description": "<div align=\"center\">\n <img src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/neurosurf_logo_text_black.png\" alt=\"Neurosurf \u2014 AI Agent Framework\" width=\"50%\"/>\n <img src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/neurosurf_banner.svg\" alt=\"Neurosurf \u2014 AI Agent Framework\" width=\"100%\"/>\n \n <a href=\"https://naumanhsa.github.io/neurosurf/#quick-start\"><img src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/quick_start_button.png\" height=\"40\" alt=\"Quick Start\"></a>\n <a href=\"https://naumanhsa.github.io/neurosurf/examples/\"><img src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/examples_button.png\" height=\"40\" alt=\"Examples\"></a>\n <a href=\"https://naumanhsa.github.io/neurosurf/\"><img src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/documentation_button.png\" height=\"40\" alt=\"Documentation\"></a>\n <a href=\"https://pypi.org/project/neurosurf/\"><img src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/pypi_button.png\" height=\"40\" alt=\"PyPI\"></a>\n <a href=\"\"><img src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/Discord_button.png\" height=\"40\" alt=\"Discord\"></a>\n\n\n</div>\n\n**Neurosurf** helps you build intelligent apps that blend **LLM reasoning**, **tools**, and **retrieval** with a ready-to-run **FastAPI** backend and a **React** dev UI. Start lean, add power as you go \u2014 CPU-only or GPU-accelerated.\n\n- \ud83e\udde9 **OpenAI-style API** with streaming & tool-calling \n- \ud83d\udcda **RAG-ready**: ingest \u2192 chunk \u2192 retrieve \u2192 augment \n- \ud83e\udd16 **Agents** (ReAct, SQL, RAG) + \ud83d\udd27 **Tools** (calc, web, custom) \n- \ud83e\udde0 **Multi-LLM**: OpenAI, Transformers/Unsloth, vLLM, Llama.cpp, more \n- \ud83d\udda5\ufe0f **NeurowebUI** (React) for chat UX, threads, uploads\n\n## \ud83d\uddde\ufe0f News\n\n- **CLI `serve` improvements** \u2014 run backend-only or UI-only, inject `VITE_BACKEND_URL` automatically. See [:link: CLI guide](cli.md). \n- **Model registry & RAG hooks** \u2014 easier wiring for multi-model setups. See [:link: Example App](server/example-app.md). \n- **Optional LLM stack** \u2014 install heavy deps only when you need them: \n ```bash\n pip install \"neurosurf[torch]\"\n ```\n\n> Looking for older updates? Check the repo **Releases** and **Changelog**.\n\n\n## \u26a1 Quick Start\n\nA 60-second path from install \u2192 dev server \u2192 your first inference.\n\n**Install (minimal core):**\n```bash\npip install -U neurosurf\n```\n\n**Or full LLM stack (torch, transformers, bnb, unsloth):**\n```bash\npip install -U \"neurosurf[torch]\"\n```\n\n**Run the dev server (backend + UI):**\n```bash\nneurosurf serve\n```\n- Auto-detects UI; pass `--ui-root` if needed. First run may `npm install`. 
\n- Backend binds to config defaults; override with flags or envs.\n\n**Hello LLM Example:**\n```python\nfrom neurosurf.models.chat_models.transformers import TransformersModel\n\nllm = TransformersModel(\n model_name=\"unsloth/Llama-3.2-1B-Instruct-bnb-4bit\",\n load_in_4bit=True\n)\nres = llm.ask(user_prompt=\"Say hi!\", system_prompt=\"Be concise.\", stream=False)\nprint(res.choices[0].message.content)\n```\n\n## \ud83c\udfd7\ufe0f High-Level Architecture\n<div align=\"center\">\n <img alt=\"Neurosurf Architecture\" src=\"https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/docs/assets/neurosurf_architecture_light.png\" width=\"100%\"/>\n <p><strong>Neurosurf Architecture</strong></p>\n</div>\n\n## \u2728 Key Features\n\n- **Production API** \u2014 FastAPI backend with auth, chat APIs, and OpenAI-compatible endpoints \u2192 [Server setup](https://naumanhsa.github.io/neurosurf/server/)\n\n- **Intelligent Agents** \u2014 Build ReAct, SQL, and RAG agents with minimal code, optimized for specific tasks \u2192 [Learn about agents](https://naumanhsa.github.io/neurosurf/api-reference/agents/)\n\n- **Rich Tool Ecosystem** \u2014 Built-in tools (calculator, web calls, files) plus easy custom tools \u2192 [Explore tools](https://naumanhsa.github.io/neurosurf/api-reference/tools/)\n\n- **RAG System** \u2014 Ingest, chunk, and retrieve relevant context for your LLMs \u2192 [RAG System](https://naumanhsa.github.io/neurosurf/api-reference/rag/)\n\n- **Vector Databases** \u2014 Built-in ChromaDB with an extensible interface for other stores \u2192 [Vector stores](https://naumanhsa.github.io/neurosurf/api-reference/vectorstores/)\n\n- **Multi-LLM Support** \u2014 OpenAI, Transformers/Unsloth, vLLM, Llama.cpp, and OpenAI-compatible APIs \u2192 [Model docs](https://naumanhsa.github.io/neurosurf/api-reference/models/)\n\n## \ud83d\udce6 Install Options\n\n**pip (recommended)**\n```bash\npip install -U neurosurf\n```\n\n**pip + full LLM stack**\n```bash\npip install -U \"neurosurf[torch]\"\n```\n\n**From source**\n```bash\ngit clone https://github.com/NaumanHSA/neurosurf.git\ncd neurosurf && pip install -e \".[torch]\"\n```\n\nCUDA notes (Linux x86_64):\n```bash\n# Wheels bundle CUDA; you just need a compatible NVIDIA driver.\npip install -U torch --index-url https://download.pytorch.org/whl/cu124\n# or CPU-only:\npip install -U torch --index-url https://download.pytorch.org/whl/cpu\n```\n\n## \ud83d\udcdd License\n\nLicensed under **Apache-2.0**. See [`LICENSE`](https://raw.githubusercontent.com/NaumanHSA/neurosurf/main/LICENSE).\n\n## \ud83c\udf1f Support\n\n- \u2b50 Star the project on [GitHub](https://github.com/NaumanHSA/neurosurf).\n- \ud83d\udcac Ask & share in **Discussions**: [Discussions](https://github.com/NaumanHSA/neurosurf/discussions).\n- \ud83e\udde0 Read the [Docs](https://naumanhsa.github.io/neurosurf/).\n- \ud83d\udc1b File [Issues](https://github.com/NaumanHSA/neurosurf/issues).\n- \ud83d\udd12 Security: report privately to **naumanhsa965@gmail.com**.\n\n## \ud83d\udcda Citation\n\nIf you use **Neurosurf** in your work, please cite:\n\n```bibtex\n@software{neurosurf,\n author = {Nouman Ahsan and Neurosurf contributors},\n title = {Neurosurf: A Production-Ready AI Agent Framework},\n year = {2025},\n url = {https://github.com/NaumanHSA/neurosurf},\n version = {0.1.0},\n license = {Apache-2.0}\n}\n```\n\n---\n\n<div align=\"center\">\n <sub>Built with \u2764\ufe0f by the Neurosurf team\n</div>\n",
"bugtrack_url": null,
"license": null,
"summary": "Neurosurf: production-ready AI agent framework with multi-LLM, RAG, tools, and FastAPI server",
"version": "0.1.4",
"project_urls": {
"Documentation": "https://naumanhsa.github.io/neurosurf/",
"Homepage": "https://naumanhsa.github.io/neurosurf/",
"Issues": "https://github.com/NaumanHSA/neurosurf/issues",
"Repository": "https://github.com/NaumanHSA/neurosurf"
},
"split_keywords": [
"llm",
" agents",
" rag",
" transformers",
" fastapi",
" chromadb"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "a3303c998c2c3fa73cdbba499db9ddd6b17fcbf7d25cc2b246d8d9fddf3b7237",
"md5": "e540ce4818536a1c2c1a3a09d4376043",
"sha256": "cf3846fa36083ba0c34b297d1a24bdd2cc507b93ba940a1399627f71dbe6148d"
},
"downloads": -1,
"filename": "neurosurf-0.1.4-py3-none-any.whl",
"has_sig": false,
"md5_digest": "e540ce4818536a1c2c1a3a09d4376043",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 592307,
"upload_time": "2025-11-02T18:20:28",
"upload_time_iso_8601": "2025-11-02T18:20:28.810941Z",
"url": "https://files.pythonhosted.org/packages/a3/30/3c998c2c3fa73cdbba499db9ddd6b17fcbf7d25cc2b246d8d9fddf3b7237/neurosurf-0.1.4-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "7c88de56a26497287b31910851523801d3cbc5d8006e2ed96d46a307ff18740f",
"md5": "6b8125fd5d8972340e0fcb93524c663a",
"sha256": "c89c1437ae76d0f7ddf7f65a0616e4094058f48af7e8937d3f7e8680daca2e6b"
},
"downloads": -1,
"filename": "neurosurf-0.1.4.tar.gz",
"has_sig": false,
"md5_digest": "6b8125fd5d8972340e0fcb93524c663a",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 559409,
"upload_time": "2025-11-02T18:20:30",
"upload_time_iso_8601": "2025-11-02T18:20:30.621947Z",
"url": "https://files.pythonhosted.org/packages/7c/88/de56a26497287b31910851523801d3cbc5d8006e2ed96d46a307ff18740f/neurosurf-0.1.4.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-11-02 18:20:30",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "NaumanHSA",
"github_project": "neurosurf",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "neurosurf"
}
```