# kailash-dataflow

- **Version:** 0.3.7
- **Summary:** Workflow-native database framework for Kailash SDK
- **Homepage:** https://github.com/Integrum-Global/kailash_python_sdk
- **Author:** Integrum (info@integrum.com)
- **Requires Python:** >=3.8
- **License:** Apache-2.0 WITH Additional-Terms
- **Uploaded:** 2025-08-02 03:17:24
# Kailash DataFlow

**Zero-Config Database Framework** - Django simplicity meets enterprise-grade production quality.

## πŸš€ Quick Start (60 seconds)

```python
from kailash_dataflow import DataFlow

# That's it! No configuration needed
db = DataFlow()

# Define your model
@db.model
class User:
    id: int
    name: str
    email: str

# DataFlow automatically creates:
# βœ… Database schema (PostgreSQL, MySQL, SQLite)
# βœ… Auto-migration system (PostgreSQL-only, production-ready)
# βœ… 9 workflow nodes per model (CRUD + bulk ops)
# βœ… Real SQL operations with security
# βœ… Connection pooling and transaction management
# βœ… MongoDB-style query builder (implemented!)
# βœ… Concurrent access protection with locking
# βœ… Schema state management with rollback
# ⚠️ Redis query cache (planned)
# ⚠️ Multi-database runtime (PostgreSQL only)
```

You now have a production-ready database layer!
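The schema derivation behind `@db.model` can be sketched in plain Python. This is a self-contained illustration, not DataFlow's implementation: the type mapping, the pluralized table name, and the `id`-as-primary-key convention are assumptions made for the example.

```python
# Sketch: deriving a CREATE TABLE statement from class annotations,
# similar in spirit to what @db.model does. Type mapping is illustrative only.
PY_TO_SQL = {int: "INTEGER", str: "TEXT", float: "REAL", bool: "BOOLEAN"}

def create_table_sql(cls) -> str:
    cols = []
    for name, py_type in cls.__annotations__.items():
        sql_type = PY_TO_SQL[py_type]
        # Treat a field named "id" as the primary key, as in the User example
        if name == "id":
            cols.append(f"{name} {sql_type} PRIMARY KEY")
        else:
            cols.append(f"{name} {sql_type} NOT NULL")
    table = cls.__name__.lower() + "s"  # assumed naming convention
    return f"CREATE TABLE {table} ({', '.join(cols)})"

class User:
    id: int
    name: str
    email: str

print(create_table_sql(User))
# CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL, email TEXT NOT NULL)
```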

## 🎯 What Makes DataFlow Different?

### Zero Configuration That Actually Works
```python
# Development? Uses SQLite automatically
db = DataFlow()  # Just works!

# Production? Reads from environment
# DATABASE_URL=postgresql://...
db = DataFlow()  # Still just works!

# Need control? Progressive enhancement
db = DataFlow(
    pool_size=50,
    read_replicas=['replica1', 'replica2'],
    monitoring=True
)
```
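The resolution order above can be sketched as follows; `resolve_database_url` and the `sqlite:///dataflow.db` default are hypothetical names used only for illustration:

```python
import os
from typing import Optional

def resolve_database_url(explicit_url: Optional[str] = None) -> str:
    """Sketch of zero-config resolution: an explicit URL wins,
    then DATABASE_URL from the environment, then a SQLite default."""
    if explicit_url:
        return explicit_url
    return os.environ.get("DATABASE_URL", "sqlite:///dataflow.db")

os.environ.pop("DATABASE_URL", None)
print(resolve_database_url())  # sqlite:///dataflow.db
os.environ["DATABASE_URL"] = "postgresql://user:pass@localhost/db"
print(resolve_database_url())  # postgresql://user:pass@localhost/db
```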

### Real Database Operations (Currently Available)
```python
# Traditional ORMs: Imperative code
User.objects.create(name="Alice")  # Django
user = User(name="Alice"); session.add(user)  # SQLAlchemy

# DataFlow: Workflow-native database operations
workflow = WorkflowBuilder()
workflow.add_node("UserCreateNode", "create_user", {
    "name": "Alice",
    "email": "alice@example.com"
})
workflow.add_node("UserListNode", "find_users", {
    "limit": 10,
    "offset": 0
})

# Real SQL is executed: INSERT INTO users (name, email) VALUES ($1, $2)
```

### MongoDB-Style Query Builder (NEW!)
```python
# Get QueryBuilder from any model
builder = User.query_builder()

# MongoDB-style operators
builder.where("age", "$gte", 18)
builder.where("status", "$in", ["active", "premium"])
builder.where("email", "$regex", r"^[a-z]+@company\.com$")  # raw string for the regex
builder.order_by("created_at", "DESC")
builder.limit(10)

# Generates optimized SQL for your database
sql, params = builder.build_select()
# PostgreSQL: SELECT * FROM "users" WHERE "age" >= $1 AND "status" IN ($2, $3) AND "email" ~ $4 ORDER BY "created_at" DESC LIMIT 10

# Works seamlessly with ListNode
workflow.add_node("UserListNode", "search", {
    "filter": {
        "age": {"$gte": 18},
        "status": {"$in": ["active", "premium"]},
        "email": {"$regex": "^admin"}
    }
})
```
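The translation from operator dict to parameterized SQL can be sketched in a few lines. The operator table and quoting below are illustrative (PostgreSQL-style `$n` placeholders), not DataFlow's actual compiler:

```python
# Sketch: compiling a MongoDB-style filter dict into a parameterized
# PostgreSQL WHERE clause. Operator coverage is illustrative, not exhaustive.
OPS = {"$eq": "=", "$gt": ">", "$gte": ">=", "$lt": "<", "$lte": "<=", "$regex": "~"}

def build_where(filter_dict):
    clauses, params = [], []
    for field, cond in filter_dict.items():
        for op, value in cond.items():
            if op == "$in":
                # One numbered placeholder per list element
                placeholders = ", ".join(f"${len(params) + i + 1}" for i in range(len(value)))
                clauses.append(f'"{field}" IN ({placeholders})')
                params.extend(value)
            else:
                params.append(value)
                clauses.append(f'"{field}" {OPS[op]} ${len(params)}')
    return " AND ".join(clauses), params

sql, params = build_where({
    "age": {"$gte": 18},
    "status": {"$in": ["active", "premium"]},
})
print(sql)     # "age" >= $1 AND "status" IN ($2, $3)
print(params)  # [18, 'active', 'premium']
```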

### Database Requirements
```python
# Current limitation: PostgreSQL only for execution
db = DataFlow(database_url="postgresql://user:pass@localhost/db")

# Schema generation works for all databases
schema_sql = db.generate_complete_schema_sql("sqlite")  # βœ… Works
schema_sql = db.generate_complete_schema_sql("mysql")   # βœ… Works
schema_sql = db.generate_complete_schema_sql("postgresql")  # βœ… Works

# But execution currently requires PostgreSQL
runtime = LocalRuntime()
results, run_id = runtime.execute(workflow.build())  # βœ… PostgreSQL only
```

### Database Operations as Workflow Nodes
```python
# Traditional ORMs: Imperative code
user = User.objects.create(name="Alice")  # Django
user = User(name="Alice"); session.add(user)  # SQLAlchemy

# DataFlow: Workflow-native (9 nodes per model!)
workflow = WorkflowBuilder()
workflow.add_node("UserCreateNode", "create_user", {
    "name": "Alice",
    "email": "alice@example.com"
})
workflow.add_node("UserListNode", "find_users", {
    "filter": {"name": {"$like": "A%"}}
})
```

### Enterprise Configuration
```python
# Multi-tenancy configuration (query modification planned)
db = DataFlow(multi_tenant=True)

# Real SQL generation with security
db = DataFlow(
    database_url="postgresql://user:pass@localhost/db",
    pool_size=20,
    pool_max_overflow=30,
    monitoring=True,
    echo=False  # No SQL logging in production
)

# All generated nodes use parameterized queries for security
# INSERT INTO users (name, email) VALUES ($1, $2)  -- Safe from SQL injection
```
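The value of parameterized queries is easy to demonstrate with the stdlib `sqlite3` driver standing in for PostgreSQL; the placeholder syntax differs (`?` vs `$1`) but the injection-safety principle is the same:

```python
import sqlite3

# Sketch: why parameterized queries block SQL injection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

hostile = "Alice'); DROP TABLE users; --"
# Placeholders send the value out-of-band, so it is stored verbatim
# and never interpreted as SQL.
conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
             (hostile, "alice@example.com"))

row = conn.execute("SELECT name FROM users").fetchone()
print(row[0])  # Alice'); DROP TABLE users; --
```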

## 🚦 Implementation Status

### βœ… Currently Available (Production-Ready)
- **Database Schema Generation**: Complete CREATE TABLE for PostgreSQL, MySQL, SQLite
- **Auto-Migration System**: PostgreSQL-only, production-ready automatic schema synchronization
- **Real Database Operations**: All 9 CRUD + bulk nodes execute actual SQL
- **SQL Security**: Parameterized queries prevent SQL injection
- **Connection Management**: Connection pooling, DDL execution, error handling
- **Workflow Integration**: Full compatibility with WorkflowBuilder/LocalRuntime
- **Configuration System**: Zero-config to enterprise patterns
- **MongoDB-Style Query Builder**: Complete with all operators ($eq, $gt, $in, $regex, etc.)
- **Concurrent Access Protection**: Migration locking and atomic operations
- **Schema State Management**: Change detection, caching, and rollback capabilities

### ⚠️ Limitations
- **Database Runtime**: PostgreSQL execution only (schema generation works for all)
- **AsyncSQLDatabaseNode**: Currently requires a PostgreSQL connection string

### πŸ”„ Planned Features (Roadmap)
- **Redis Query Caching**: `User.cached_query()` with automatic invalidation
- **Multi-Database Runtime**: SQLite/MySQL execution support
- **Advanced Multi-Tenancy**: Automatic query modification for tenant isolation

## πŸ“š Documentation

### Getting Started
- **[5-Minute Tutorial](docs/getting-started/quickstart.md)** - Build your first app
- **[Core Concepts](docs/getting-started/concepts.md)** - Understand DataFlow
- **[Examples](examples/)** - Complete applications

### Development
- **[Models](docs/development/models.md)** - Define your schema
- **[CRUD Operations](docs/development/crud.md)** - Basic operations
- **[Relationships](docs/development/relationships.md)** - Model associations

### Production
- **[Deployment](docs/production/deployment.md)** - Go to production
- **[Performance](docs/production/performance.md)** - Optimization guide
- **[Monitoring](docs/advanced/monitoring.md)** - Observability

## πŸ’‘ Real-World Examples

### E-Commerce Platform
```python
# Define your models
@db.model
class Product:
    id: int
    name: str
    price: float
    stock: int

@db.model
class Order:
    id: int
    user_id: int
    total: float
    status: str

# Use in workflows
workflow = WorkflowBuilder()

# Check inventory
workflow.add_node("ProductGetNode", "check_stock", {
    "id": "{product_id}"
})

# Create order with transaction
workflow.add_node("TransactionContextNode", "tx_start")
workflow.add_node("OrderCreateNode", "create_order", {
    "user_id": "{user_id}",
    "total": "{total}"
})
workflow.add_node("ProductUpdateNode", "update_stock", {
    "id": "{product_id}",
    "stock": "{new_stock}"
})
```
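The atomicity that `TransactionContextNode` is meant to provide here (order insert and stock decrement succeed or fail together) can be illustrated with stdlib `sqlite3`; `place_order` is a hypothetical helper, not a DataFlow API:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, stock INTEGER)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, product_id INTEGER, total REAL)")
conn.execute("INSERT INTO products (id, stock) VALUES (1, 5)")
conn.commit()

def place_order(conn, product_id, total):
    # Both statements commit together or roll back together.
    try:
        with conn:  # sqlite3 connection as context manager = one transaction
            conn.execute("INSERT INTO orders (product_id, total) VALUES (?, ?)",
                         (product_id, total))
            cur = conn.execute(
                "UPDATE products SET stock = stock - 1 WHERE id = ? AND stock > 0",
                (product_id,))
            if cur.rowcount == 0:
                raise RuntimeError("out of stock")  # triggers rollback of the insert
    except RuntimeError:
        return False
    return True

print(place_order(conn, 1, 9.99))  # True
```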

### Multi-Tenant SaaS (Current Implementation)
```python
# Enable multi-tenancy configuration
db = DataFlow(
    database_url="postgresql://user:pass@localhost/db",
    multi_tenant=True
)

# Multi-tenant models get tenant_id field automatically
@db.model
class User:
    name: str
    email: str
    # tenant_id: str automatically added

# Use in workflows with real database operations
workflow.add_node("UserCreateNode", "create_user", {
    "name": "Alice",
    "email": "alice@acme-corp.com"
})
workflow.add_node("UserListNode", "list_users", {
    "limit": 10,
    "filter": {}
})
```

### High-Performance ETL (Current Implementation)
```python
# Bulk operations with real database execution
workflow.add_node("UserBulkCreateNode", "import_users", {
    "data": users_data,  # List of user records
    "batch_size": 1000,
    "conflict_resolution": "skip"
})

# Real bulk INSERT operations executed
# Uses parameterized queries for security
# Processes data in configurable batches

# List operations with filters
workflow.add_node("UserListNode", "active_users", {
    "limit": 1000,
    "offset": 0,
    "order_by": ["created_at"],
    "filter": {"active": True}
})
```
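The `batch_size` behavior can be sketched with a simple chunking generator (illustrative only; DataFlow's batching internals may differ):

```python
# Sketch: splitting a bulk payload into batches of batch_size,
# as the "batch_size": 1000 parameter suggests.
def batches(records, batch_size):
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

users_data = [{"name": f"user{i}"} for i in range(2500)]
sizes = [len(b) for b in batches(users_data, 1000)]
print(sizes)  # [1000, 1000, 500]
```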

## πŸ—οΈ Architecture

DataFlow seamlessly integrates with Kailash's workflow architecture:

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                 Your Application                     β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚                    DataFlow                          β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”         β”‚
β”‚  β”‚  Models  β”‚  β”‚   Nodes  β”‚  β”‚ Migrationsβ”‚         β”‚
β”‚  β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜         β”‚
β”‚       β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜               β”‚
β”‚                Core Features                         β”‚
β”‚  QueryBuilder β”‚ QueryCache β”‚ Monitoring β”‚ Multi-tenant β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”         β”‚
β”‚  β”‚MongoDB-  β”‚  β”‚Redis     β”‚  β”‚Pattern   β”‚         β”‚
β”‚  β”‚style     β”‚  β”‚Caching   β”‚  β”‚Invalidateβ”‚         β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜         β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚               Kailash SDK                           β”‚
β”‚         Workflows β”‚ Nodes β”‚ Runtime                 β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```

## πŸ§ͺ Testing

DataFlow includes comprehensive testing support:

```python
# Test with in-memory database
def test_user_creation():
    db = DataFlow(testing=True)

    @db.model
    class User:
        id: int
        name: str

    # Automatic test isolation
    user = db.test_create(User, name="Test User")
    assert user.name == "Test User"
```

## 🀝 Contributing

We welcome contributions! DataFlow follows Kailash SDK patterns:

1. Use SDK components and patterns
2. Maintain zero-config philosophy
3. Write comprehensive tests
4. Update documentation

See [CONTRIBUTING.md](CONTRIBUTING.md) for details.

## πŸ“Š Performance

DataFlow provides real database performance with PostgreSQL:

- **Real SQL execution** with parameterized queries
- **Connection pooling** with configurable pool sizes
- **Bulk operations** with batching for large datasets
- **Production-ready** database operations

Performance testing requires a PostgreSQL database setup.
Advanced caching and query optimization features are planned.

## ⚑ Why DataFlow?

- **Real Database Operations**: Actual SQL execution, not mocks
- **Workflow-Native**: Database ops as first-class nodes
- **Production-Ready**: PostgreSQL support with connection pooling
- **Progressive**: Simple to start, enterprise features available
- **100% Kailash**: Built on proven SDK components

---

**Built with Kailash SDK** | [Parent Project](../../README.md) | [SDK Docs](../../sdk-users/)

            
