# Entity — by Project David
[Build & Release](https://github.com/frankie336/entitites_sdk/actions/workflows/test_tag_release.yml)
[License: PolyForm Noncommercial 1.0.0](https://polyformproject.org/licenses/noncommercial/1.0.0/)
The **Entity SDK** is a composable, Pythonic interface to the [Entities API](https://github.com/frankie336/entities_api) for building intelligent applications across **local, open-source**, and **cloud LLMs**.
It unifies:
- Users, threads, assistants, messages, runs, inference
- **Function calling**, **code interpretation**, and **structured streaming**
- Vector memory, file uploads, and secure tool orchestration
Local inference is fully supported via [Ollama](https://github.com/ollama).
---
## 🔌 Supported Inference Providers
| Provider | Type |
|--------------------------------------------------|--------------------------|
| [Ollama](https://github.com/ollama) | **Local** (Self-Hosted) |
| [DeepSeek](https://platform.deepseek.com/) | ☁ **Cloud** (Open-Source) |
| [Hyperbolic](https://hyperbolic.xyz/) | ☁ **Cloud** (Proprietary) |
| [OpenAI](https://platform.openai.com/) | ☁ **Cloud** (Proprietary) |
| [Together AI](https://www.together.ai/) | ☁ **Cloud** (Aggregated) |
| [Azure Foundry](https://azure.microsoft.com) | ☁ **Cloud** (Enterprise) |
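
For local inference, the same streaming call shown in the Quick Start below can target an Ollama-served model. A minimal sketch, assuming a hypothetical provider name and locally pulled model tag (see [inference.md](/docs/inference.md) for the identifiers your deployment actually accepts):

```python
# Minimal sketch: point the streaming helper at a local Ollama backend.
# "Ollama" and "ollama/llama3" are assumed identifiers, not confirmed values.
for chunk in sync_stream.stream_chunks(
    provider="Ollama",       # hypothetical provider name
    model="ollama/llama3",   # hypothetical local model tag
    timeout_per_chunk=30.0,
):
    print(chunk.get("content", ""), end="", flush=True)
```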
---
## 📦 Installation
```bash
pip install projectdavid
```
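
The Quick Start below reads both the Entities key and the inference provider key from the environment. A typical `.env`, with placeholder values, might look like:

```bash
# .env — placeholder values; substitute your own keys
ENTITIES_API_KEY=ea_6zZiZ...
HYPERBOLIC_API_KEY=hyp_...
```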
---
## Quick Start
```python
import os
from dotenv import load_dotenv
from projectdavid import Entity
load_dotenv()
# --------------------------------------------------
# Create the Entities client with your user API key.
# Note: if ENTITIES_API_KEY="ea_6zZiZ..." is defined in .env,
# you do not need to pass the API key explicitly.
# It is passed explicitly here for clarity.
# ---------------------------------------------------
client = Entity(base_url="http://localhost:9000", api_key=os.getenv("ENTITIES_API_KEY"))
user_id = "user_kUKV8octgG2aMc7kxAcD3i"
# -----------------------------
# Create an assistant
# ------------------------------
assistant = client.assistants.create_assistant(
name="test_assistant",
instructions="You are a helpful AI assistant",
)
print(f"created assistant with ID: {assistant.id}")
# -----------------------------------------------
# Create a thread
# Note: threads are reusable.
# Reuse a thread when you want a continued,
# multi-turn conversation.
# ------------------------------------------------
print("Creating thread...")
thread = client.threads.create_thread(participant_ids=[user_id])
print(f"created thread with ID: {thread.id}")
# Store the dynamically created thread ID
actual_thread_id = thread.id
# -----------------------------------------
# Create a message using the NEW thread ID
# --------------------------------------------
print(f"Creating message in thread {actual_thread_id}...")
message = client.messages.create_message(
thread_id=actual_thread_id,
role="user",
content="Hello, assistant! Tell me about the latest trends in AI.",
assistant_id=assistant.id,
)
print(f"Created message with ID: {message.id}")
# ---------------------------------------------
# Create a run using the NEW thread ID
# ----------------------------------------------
print(f"Creating run in thread {actual_thread_id}...")
run = client.runs.create_run(assistant_id=assistant.id, thread_id=actual_thread_id)
print(f"Created run with ID: {run.id}")
# ------------------------------------------------
# Instantiate the synchronous streaming helper
# --------------------------------------------------
sync_stream = client.synchronous_inference_stream
# ------------------------------------------------------
# Set up the stream using the NEW thread ID
# --------------------------------------------------------
print(f"Setting up stream for thread {actual_thread_id}...")
sync_stream.setup(
user_id=user_id,
thread_id=actual_thread_id,
assistant_id=assistant.id,
message_id=message.id,
run_id=run.id,
api_key=os.getenv("HYPERBOLIC_API_KEY"),
)
print("Stream setup complete. Starting streaming...")
# --- Stream initial LLM response ---
try:
for chunk in sync_stream.stream_chunks(
provider="Hyperbolic",
model="hyperbolic/deepseek-ai/DeepSeek-V3-0324", # Ensure this model is valid/available
timeout_per_chunk=15.0,
):
content = chunk.get("content", "")
if content:
print(content, end="", flush=True)
print("\n--- End of Stream ---") # Add newline after stream
except Exception as e:
print(f"\n--- Stream Error: {e} ---") # Catch errors during streaming
print("Script finished.")
```
**The assistant's response:**
Hello! The field of AI is evolving rapidly, and here are some of the latest trends as of early 2025:
### 1. **Multimodal AI Models**
- Models like GPT-4, Gemini, and others now seamlessly process text, images, audio, and video in a unified way, enabling richer interactions (e.g., ChatGPT with vision).
- Applications include real-time translation with context, AI-generated video synthesis, and more immersive virtual assistants.
### 2. **Smaller, More Efficient Models**
- While giant models (e.g., GPT-4, Claude 3) still dominate, there’s a push for smaller, specialized models (e.g., Microsoft’s Phi-3, Mistral 7B) that run locally on devices with near-LLM performance.
- Focus on **energy efficiency** and reduced computational costs.
### 3. **AI Agents & Autonomous Systems**
- AI “agents” (e.g., OpenAI’s “Agentic workflows”) can now perform multi-step tasks autonomously, like coding, research, or booking trips.
- Companies are integrating agentic AI into workflows (e.g., Salesforce, Notion AI).
### 4. **Generative AI Advancements**
- **Video generation**: Tools like OpenAI’s Sora, Runway ML, and Pika Labs produce high-quality, longer AI-generated videos.
- **3D asset creation**: AI can now generate 3D models from text prompts (e.g., Nvidia’s tools).
- **Voice cloning**: Ultra-realistic voice synthesis (e.g., ElevenLabs) is raising ethical debates.
### 5. **Regulation & Ethical AI**
- Governments are catching up with laws like the EU AI Act and U.S. executive orders on AI safety.
- Watermarking AI content (e.g., C2PA standards) is gaining traction to combat deepfakes.
### 6. **AI in Science & Healthcare**
- AlphaFold 3 (DeepMind) predicts protein interactions with unprecedented accuracy.
- AI-driven drug discovery (e.g., Insilico Medicine) is accelerating clinical trials.
### 7. **Open-Source vs. Closed AI**
- Tension between open-source (Mistral, Meta’s Llama 3) and proprietary models (GPT-4, Gemini) continues, with debates over safety and innovation.
### 8. **AI Hardware Innovations**
- New chips (e.g., Nvidia’s Blackwell, Groq’s LPUs) are optimizing speed and cost for AI workloads.
- “AI PCs” with NPUs (neural processing units) are becoming mainstream.
### 9. **Personalized AI**
- Tailored AI assistants learn individual preferences (e.g., Rabbit R1, Humane AI Pin).
- Privacy-focused local AI (e.g., Apple’s on-device AI in iOS 18).
### 10. **Quantum AI (Early Stages)**
- Companies like Google and IBM are exploring quantum machine learning, though practical applications remain limited.
Would you like a deeper dive into any of these trends?
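
Because threads are reusable, a follow-up turn can post a new message and run against the same thread ID. A minimal continuation of the script above (same client, assistant, and streaming helper; only the message content and the new run change):

```python
# Follow-up turn on the same thread: create a new message and run,
# then stream the response with the same helper and provider settings.
follow_up = client.messages.create_message(
    thread_id=actual_thread_id,
    role="user",
    content="Which of these trends matter most for on-device inference?",
    assistant_id=assistant.id,
)
next_run = client.runs.create_run(assistant_id=assistant.id, thread_id=actual_thread_id)

sync_stream.setup(
    user_id=user_id,
    thread_id=actual_thread_id,
    assistant_id=assistant.id,
    message_id=follow_up.id,
    run_id=next_run.id,
    api_key=os.getenv("HYPERBOLIC_API_KEY"),
)
for chunk in sync_stream.stream_chunks(
    provider="Hyperbolic",
    model="hyperbolic/deepseek-ai/DeepSeek-V3-0324",
    timeout_per_chunk=15.0,
):
    print(chunk.get("content", ""), end="", flush=True)
```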
---
## 📚 Documentation
| Domain | Link |
|---------------------|--------------------------------------------------------|
| Assistants | [assistants.md](/docs/assistants.md) |
| Threads | [threads.md](/docs/threads.md) |
| Messages | [messages.md](/docs/messages.md) |
| Runs | [runs.md](/docs/runs.md) |
| Inference | [inference.md](/docs/inference.md) |
| Streaming | [streams.md](/docs/streams.md) |
| Function Calling | [function_calls.md](/docs/function_calls.md) |
| Code Interpretation | [code_interpretation.md](/docs/code_interpretation.md) |
| Files | [files.md](/docs/files.md) |
| Vector Store (RAG)  | [vector_store.md](/docs/vector_store.md)               |
| Versioning | [versioning.md](/docs/versioning.md) |
---
## ✅ Compatibility & Requirements
- Python **3.10+**
- Compatible with **local** or **cloud** deployments of the Entities API
---
## 🌍 Related Repositories
- 🔌 [Entities API](https://github.com/frankie336/entities_api) — containerized API backend
- 📚 [entities_common](https://github.com/frankie336/entities_common) — shared validation, schemas, utilities, and tools.
  Installed automatically as a dependency of both the Entities SDK and the Entities API.