# jpy-sync-db-lite
Jim's Python - Synchronous Database Wrapper for SQLite
A lightweight, thread-safe SQLite database wrapper built on SQLAlchemy with optimized performance for concurrent operations.
## Features
- **Thread-safe operations** with worker thread pool
- **SQLAlchemy 2.0+ compatibility** using the modern 2.0-style API
- **Performance optimized** with SQLite-specific pragmas
- **Simple API** for common database operations
- **Consolidated operations** for both single and bulk operations
- **Batch SQL execution** for multiple statements in a single operation
- **Transaction support** for complex operations
- **Statistics tracking** for monitoring performance
- **Robust SQL parsing** via the `sqlparse` library for reliable statement splitting
- **SQLite-specific management** with VACUUM, ANALYZE, integrity checks, and PRAGMA configuration
- **Database optimization tools** for performance tuning and maintenance
- **Enhanced error handling** with SQLite-specific exception types
## Installation
### From PyPI (when published)
```bash
pip install jpy-sync-db-lite
```
### From source
```bash
git clone https://github.com/jim-schilling/jpy-sync-db-lite.git
cd jpy-sync-db-lite
pip install -e .
```
### Development setup
```bash
git clone https://github.com/jim-schilling/jpy-sync-db-lite.git
cd jpy-sync-db-lite
pip install -e ".[dev]"
```
## Quick Start
```python
from jpy_sync_db_lite import DbEngine
db = DbEngine('sqlite:///my_database.db',
              num_workers=1,
              debug=False)
# Get SQLite information
sqlite_info = db.get_sqlite_info()
print(f"SQLite version: {sqlite_info['version']}")
print(f"Database size: {sqlite_info['database_size']} bytes")
# Configure SQLite settings for better performance
db.configure_pragma('cache_size', '-128000') # 128MB cache
db.configure_pragma('synchronous', 'NORMAL') # Balance speed and safety
# Create a table
db.execute("""
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY,
name TEXT NOT NULL,
email TEXT UNIQUE
)
""")
# Insert single record
db.execute(
    "INSERT INTO users (name, email) VALUES (:name, :email)",
    {"name": "John Doe", "email": "john@example.com"}
)
# Fetch data
users = db.fetch("SELECT * FROM users WHERE name = :name", {"name": "John Doe"})
print(users) # [{'id': 1, 'name': 'John Doe', 'email': 'john@example.com'}]
# Run SQLite maintenance operations
db.analyze() # Update query planner statistics
db.optimize() # Run optimization commands
# Check database integrity
issues = db.integrity_check()
if issues:
    print(f"Integrity issues: {issues}")
else:
    print("Database integrity check passed")
# Batch operations - execute multiple SQL statements
batch_sql = """
-- Create a new table
CREATE TABLE IF NOT EXISTS logs (
id INTEGER PRIMARY KEY,
message TEXT,
timestamp DATETIME DEFAULT CURRENT_TIMESTAMP
);
-- Insert multiple log entries
INSERT INTO logs (message) VALUES ('Application started');
INSERT INTO logs (message) VALUES ('User login successful');
-- Query the logs
SELECT * FROM logs ORDER BY timestamp DESC LIMIT 5;
-- Update a log entry
UPDATE logs SET message = 'Application started successfully' WHERE message = 'Application started';
"""
batch_results = db.batch(batch_sql)
print(f"Batch executed {len(batch_results)} statements")
# Optional: Run VACUUM for space reclamation (use sparingly)
# db.vacuum()
# Cleanup
db.shutdown()
```
## API Reference
### DbEngine
The main database engine class that manages connections and operations.
#### Constructor
```python
DbEngine(database_url: str, **kwargs)
```
**Parameters:**
- `database_url`: SQLAlchemy database URL (e.g., 'sqlite:///database.db')
- `num_workers`: Number of worker threads (default: 1)
- `debug`: Enable SQLAlchemy echo mode (default: False)
- `timeout`: SQLite connection timeout in seconds (default: 30)
- `check_same_thread`: SQLite thread safety check (default: False)
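For reference, a minimal sketch combining the documented keyword arguments (the file name `app.db` is illustrative only):
```python
from jpy_sync_db_lite import DbEngine

# Illustrative construction using the documented keyword arguments.
db = DbEngine(
    'sqlite:///app.db',       # SQLAlchemy-style SQLite URL
    num_workers=2,            # worker threads for request processing
    debug=False,              # SQLAlchemy echo mode off
    timeout=30,               # SQLite connection timeout in seconds
    check_same_thread=False,  # allow connections to be used across threads
)
```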
#### Methods
##### execute(query, params=None)
Execute a non-query SQL statement (INSERT, UPDATE, DELETE, etc.). Handles both single operations and bulk operations.
```python
# Single operation
db.execute("UPDATE users SET name = :name WHERE id = :id",
{"name": "New Name", "id": 1})
# Bulk operation
updates = [{"id": 1, "status": "active"}, {"id": 2, "status": "inactive"}]
count = db.execute("UPDATE users SET status = :status WHERE id = :id", updates)
print(f"Updated {count} users")
```
##### fetch(query, params=None)
Execute a SELECT query and return results as a list of dictionaries.
```python
results = db.fetch("SELECT * FROM users WHERE age > :min_age",
                   {"min_age": 18})
```
##### execute_transaction(operations)
Execute multiple operations in a single transaction.
```python
operations = [
    {"operation": "execute", "query": "INSERT INTO users (name) VALUES (:name)", "params": {"name": "User1"}},
    {"operation": "fetch", "query": "SELECT COUNT(*) as count FROM users"}
]
results = db.execute_transaction(operations)
```
##### get_raw_connection()
Get a raw SQLAlchemy connection for advanced operations.
```python
from sqlalchemy import text  # needed for text() below

with db.get_raw_connection() as conn:
    # Use conn for complex operations
    result = conn.execute(text("SELECT * FROM users"))
```
##### get_stats()
Get database operation statistics.
```python
stats = db.get_stats()
print(f"Requests: {stats['requests']}, Errors: {stats['errors']}")
```
##### shutdown()
Gracefully shutdown the database engine and worker threads.
```python
db.shutdown()
```
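Since `shutdown()` stops the worker threads, it is worth pairing engine creation with `try`/`finally`; a minimal sketch (the table and data are illustrative):
```python
from jpy_sync_db_lite import DbEngine

db = DbEngine('sqlite:///my_database.db')
try:
    db.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
    db.execute("INSERT INTO notes (body) VALUES (:body)", {"body": "remember to call shutdown"})
finally:
    db.shutdown()  # release worker threads even if an operation raised
```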
##### get_sqlite_info()
Get SQLite-specific information and statistics.
```python
info = db.get_sqlite_info()
print(f"SQLite version: {info['version']}")
print(f"Database size: {info['database_size']} bytes")
print(f"Journal mode: {info['journal_mode']}")
print(f"Cache size: {info['cache_size']}")
```
**Returns:**
Dictionary containing SQLite information:
- `version`: SQLite version string
- `database_size`: Database file size in bytes (None for in-memory)
- `page_count`: Number of pages in database
- `page_size`: Page size in bytes
- `cache_size`: Current cache size
- `journal_mode`: Current journal mode (wal, delete, truncate, persist, memory, off)
- `synchronous`: Current synchronous mode (0=OFF, 1=NORMAL, 2=FULL)
- `temp_store`: Current temp store mode (0=DEFAULT, 1=FILE, 2=MEMORY)
- `mmap_size`: Memory map size in bytes
- `busy_timeout`: Busy timeout in milliseconds
##### configure_pragma(pragma_name, value)
Configure a specific SQLite PRAGMA setting.
```python
# Set cache size to 128MB
db.configure_pragma('cache_size', '-128000')
# Set synchronous mode to FULL for maximum durability
db.configure_pragma('synchronous', 'FULL')
# Set busy timeout to 60 seconds
db.configure_pragma('busy_timeout', '60000')
```
**Parameters:**
- `pragma_name`: Name of the PRAGMA (e.g., 'cache_size', 'synchronous', 'busy_timeout')
- `value`: Value to set for the PRAGMA
##### vacuum()
Perform SQLite VACUUM operation to reclaim space and optimize database.
```python
# Reclaim space and optimize database
db.vacuum()
```
**Note:** VACUUM requires exclusive access to the database and may take time for large databases.
##### analyze(table_name=None)
Update SQLite query planner statistics for better query performance.
```python
# Analyze all tables
db.analyze()
# Analyze specific table
db.analyze('users')
```
**Parameters:**
- `table_name`: Specific table to analyze (None for all tables)
##### integrity_check()
Perform SQLite integrity check and return any issues found.
```python
issues = db.integrity_check()
if issues:
    print(f"Database integrity issues found: {issues}")
else:
    print("Database integrity check passed")
```
**Returns:**
List of integrity issues (empty list if database is healthy)
##### optimize()
Run SQLite optimization commands for better performance.
```python
# Run optimization commands
db.optimize()
```
This method runs `PRAGMA optimize` and `ANALYZE` to improve query performance.
##### batch(batch_sql, allow_select=True)
Execute multiple SQL statements in a batch with thread safety.
```python
batch_sql = """
-- Create a table
CREATE TABLE IF NOT EXISTS users (
id INTEGER PRIMARY KEY,
name TEXT NOT NULL
);
-- Insert some data
INSERT INTO users (name) VALUES ('John');
INSERT INTO users (name) VALUES ('Jane');
-- Query the data (only if allow_select=True)
SELECT * FROM users;
"""
results = db.batch(batch_sql)
# Process results
for i, result in enumerate(results):
    print(f"Statement {i}: {result['operation']} - {result.get('row_count', 'N/A')} rows")
    if result['operation'] == 'fetch':
        print(f"  Data: {result['result']}")
```
**Parameters:**
- `batch_sql`: SQL string containing multiple statements separated by semicolons
- `allow_select`: If True, allows SELECT statements (default: True). If False, an error is raised when SELECT statements are found (see the sketch after the feature list below)
**Returns:**
List of dictionaries containing results for each statement:
- `statement`: The actual SQL statement executed
- `operation`: 'fetch', 'execute', or 'error'
- `result`: Query results (for SELECT) or True (for other operations)
- `row_count`: Number of rows affected/returned
- `error`: Error message (only for failed statements)
**Features:**
- **Robust SQL parsing** using the `sqlparse` library for reliable statement splitting
- Automatically removes SQL comments (`--` and `/* */`) while preserving comment-like text inside string literals
- Handles semicolons within string literals and complex SQL constructs
- Supports DDL (CREATE, ALTER, DROP) and DML (INSERT, UPDATE, DELETE) statements
- Continues execution even if individual statements fail
- Maintains transaction consistency across all statements
- Enhanced support for complex SQL constructs including triggers and BEGIN...END blocks
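As referenced above, a short sketch of the `allow_select=False` behavior. The exact exception type is not documented here, so a broad `except` is used purely for illustration:
```python
# DDL/DML only -- accepted with allow_select=False.
ddl_only = """
    CREATE TABLE IF NOT EXISTS audit (
        id INTEGER PRIMARY KEY,
        note TEXT
    );
    INSERT INTO audit (note) VALUES ('schema migrated');
"""
results = db.batch(ddl_only, allow_select=False)

# A batch containing a SELECT is rejected when allow_select=False.
try:
    db.batch("SELECT * FROM audit;", allow_select=False)
except Exception as exc:  # the documented behavior is "raises an error"
    print(f"SELECT rejected: {exc}")
```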
## Performance Optimizations
The library includes several SQLite-specific optimizations:
- **WAL mode** for better concurrency
- **Optimized cache settings** (64MB cache)
- **Memory-mapped files** (256MB)
- **Query planner optimization**
- **Static connection pooling**
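These settings can be inspected at runtime with `get_sqlite_info()` and adjusted with `configure_pragma()`; a brief sketch, assuming a `db` engine like the one in the Quick Start:
```python
# Inspect the pragmas the engine is currently using.
info = db.get_sqlite_info()
print(f"journal_mode: {info['journal_mode']}")   # expected: wal
print(f"cache_size:   {info['cache_size']}")
print(f"mmap_size:    {info['mmap_size']} bytes")

# Example adjustment: favor write speed over durability on a scratch database.
db.configure_pragma('synchronous', 'NORMAL')
```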
## Thread Safety
All operations are thread-safe through:
- Worker thread pool for request processing
- Thread-safe queues for request/response handling
- Proper connection management per operation
- Lock-protected statistics updates
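A short sketch of concurrent use from application threads, assuming a `users` table like the one created in the Quick Start:
```python
from concurrent.futures import ThreadPoolExecutor

def write_user(i: int) -> None:
    db.execute(
        "INSERT INTO users (name, email) VALUES (:name, :email)",
        {"name": f"user-{i}", "email": f"user{i}@example.com"},
    )

def count_users() -> int:
    rows = db.fetch("SELECT COUNT(*) AS n FROM users")
    return rows[0]["n"]

# Writers and readers can share the same DbEngine instance.
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(write_user, range(20)))
print(f"users after concurrent writes: {count_users()}")
```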
## Performance Testing
The library includes comprehensive performance testing tools to help you optimize your database operations.
### Quick Performance Check
Run the standalone benchmark script for a quick performance overview:
```bash
# Run all benchmarks
python tests/benchmark_db_engine.py
# Run specific tests
python tests/benchmark_db_engine.py --tests single bulk select scaling batch
# Customize parameters
python tests/benchmark_db_engine.py --operations 2000 --workers 2 --batch-sizes 100 500 1000
```
### Comprehensive Performance Tests
Run the full unittest suite for detailed performance analysis:
```bash
# Run all performance tests
python -m unittest tests.test_db_engine_performance -v
```
### Performance Test Categories
#### 1. Single Insert Performance
- Measures individual insert operation latency and throughput
- **Expected**: >50 ops/sec, <100ms average latency
#### 2. Bulk Insert Performance
- Tests different batch sizes (10, 50, 100, 250 records)
- Measures throughput and per-record latency
- **Expected**: >100 ops/sec for optimal batch sizes
#### 3. Select Performance
- Tests various query types:
- Simple SELECT with LIMIT
- Filtered SELECT with WHERE clauses
- Indexed SELECT using indexed columns
- Aggregate SELECT with COUNT/AVG
- Complex SELECT with multiple conditions and ORDER BY
- **Expected**: >200 ops/sec for simple selects
#### 4. Batch Performance
- Tests batch SQL execution with multiple statements
- Measures performance of mixed DDL/DML operations
- Tests different batch sizes and statement types
- **Expected**: >100 ops/sec for batch operations
#### 5. Concurrent Operations Performance
- Tests performance under concurrent load (1, 2, 4, 8 threads)
- Mix of read and write operations
- **Expected**: >50 ops/sec under load
#### 6. Transaction Performance
- Tests transaction operations with different sizes
- **Expected**: >50 ops/sec for transactions
#### 7. Worker Thread Scaling
- Tests performance with different worker thread configurations
- Helps determine optimal worker count
- **Expected**: >30 ops/sec with single worker
### Performance Metrics
The tests measure:
- **Throughput**: Operations per second (ops/sec)
- **Latency**: Time per operation in milliseconds
- **Memory Usage**: Memory consumption and growth rate (optional, requires `psutil`)
- **Concurrency Scaling**: Performance with multiple threads
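A minimal sketch of measuring throughput and average latency by hand (this is not the project's benchmark harness; the `users` table from the Quick Start is assumed):
```python
import time

N = 500
start = time.perf_counter()
for i in range(N):
    db.execute(
        "INSERT INTO users (name, email) VALUES (:name, :email)",
        {"name": f"bench-{i}", "email": f"bench{i}@example.com"},
    )
elapsed = time.perf_counter() - start

print(f"throughput:  {N / elapsed:.1f} ops/sec")
print(f"avg latency: {1000 * elapsed / N:.2f} ms")
```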
### Performance Expectations
Based on SQLite with WAL mode and optimized pragmas:
| Operation Type | Expected Throughput | Expected Latency |
|----------------|-------------------|------------------|
| Single Insert | >50 ops/sec | <100ms avg |
| Bulk Insert | >100 ops/sec | <50ms per record |
| Simple Select | >200 ops/sec | <10ms avg |
| Complex Select | >50 ops/sec | <50ms avg |
| Batch Operations| >100 ops/sec | <100ms avg |
| Transactions | >50 ops/sec | <100ms avg |
| Concurrent Ops | >50 ops/sec | <100ms avg |
| Single Worker | >30 ops/sec | <100ms avg |
### Optimization Recommendations
The performance tests provide recommendations for:
- **Optimal batch sizes** for bulk operations
- **Optimal worker threads** for your workload
- **Memory efficiency** analysis
- **Scaling considerations** for concurrent operations
### Memory Monitoring (Optional)
Memory usage monitoring is optional and requires the `psutil` package:
```bash
pip install psutil
```
**Note**: `psutil` is not a dependency of this package. Without `psutil`, the tests will run normally but skip memory measurements.
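A sketch of one way to make memory measurement optional, reporting resident set size only when `psutil` is importable:
```python
try:
    import psutil
except ImportError:
    psutil = None  # memory measurements are simply skipped

def rss_mb() -> float | None:
    """Current resident memory in MB, or None when psutil is unavailable."""
    if psutil is None:
        return None
    return psutil.Process().memory_info().rss / (1024 * 1024)

before = rss_mb()
# ... run database operations here ...
after = rss_mb()
if before is not None and after is not None:
    print(f"memory growth: {after - before:.1f} MB")
```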
### Performance Troubleshooting
Common performance issues and solutions:
1. **Low throughput**: Use batch or bulk operations and tune the worker count (see the sketch below)
2. **High latency**: Check for blocking operations, monitor system resources
3. **Memory growth**: Look for unclosed connections or large result sets
4. **Concurrency issues**: SQLite has limitations with concurrent writes
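For the first point, a sketch of replacing per-row `execute()` calls with a single bulk `execute()` (the `users` table from the Quick Start is assumed):
```python
rows = [{"name": f"user-{i}", "email": f"user{i}@example.com"} for i in range(1000)]

# Slower: one request per row.
# for row in rows:
#     db.execute("INSERT INTO users (name, email) VALUES (:name, :email)", row)

# Faster: a single bulk request with a list of parameter dictionaries.
count = db.execute("INSERT INTO users (name, email) VALUES (:name, :email)", rows)
print(f"inserted {count} rows")
```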
For detailed performance analysis, see [tests/PERFORMANCE_TESTS.md](tests/PERFORMANCE_TESTS.md).
## Development
### Running Tests
```bash
pytest
```
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Ensure all tests pass
6. Submit a pull request
## Changelog
### 0.3.1 (2025-07-11)
- **Dead code elimination** with removal of unused constants, methods, and imports from the database engine
- **Code cleanup** with removal of `_BATCH_STATEMENT`, `_SUCCESS`, `_ERROR`, and `_ERROR_COMMIT_FAILED` unused constants
- **Method cleanup** with removal of unused `_acquire_db_lock` context manager method (~45 lines of dead code)
- **Import optimization** with removal of unused `time` import from db_engine.py
- **Code maintainability improvements** with elimination of ~50 lines of unused code
- **Enhanced code quality** with cleaner, more focused database engine implementation
- **Better code organization** with removal of redundant and unused code elements
### 0.3.0 (2025-07-07)
- **Comprehensive test suite cleanup and optimization** with removal of all debug and extraneous print statements from test files
- **Enhanced SQL helper test coverage** with 95 comprehensive tests covering edge cases, error handling, and boundary conditions
- **Improved SQL statement type detection** with robust CTE (Common Table Expression) parsing and handling
- **Enhanced SQL parsing robustness** with better handling of invalid SQL statements and edge cases
- **Comprehensive edge case testing** for SQL helper functions including malformed SQL, nested comments, and complex CTE scenarios
- **Performance testing improvements** with optimized test execution and better coverage of SQL parsing performance
- **Enhanced error handling** for SQL parsing edge cases including incomplete comments, malformed statements, and invalid file paths
- **Improved test maintainability** with cleaner test structure and removal of debug output
- **Better SQL statement type detection** for complex scenarios including:
- CTEs with no main statement (invalid SQL handling)
- Multiple CTEs with complex nesting
- CTEs with unknown statement types after them
- Complex parentheses and nested structures in CTEs
- Window functions, JSON operations, and recursive CTEs
- **Enhanced SQL parsing edge cases** including:
- Empty statements and whitespace-only input
- Statements with only comments
- Malformed SQL with unclosed strings or comments
- Very long SQL statements and complex nesting
- String literals containing SQL keywords or semicolons
- **Improved file handling** for SQL file operations with comprehensive error handling for invalid paths and file operations
- **Enhanced integration testing** with full SQL processing pipeline tests and batch processing scenarios
- **Better test categorization** with unit, integration, performance, and coverage test classifications
- **Comprehensive performance benchmarking** for SQL parsing operations with realistic workload testing
- **Code quality improvements** with 90% test coverage for sql_helper.py and robust error handling patterns
- **Documentation updates** reflecting current test coverage and API behavior expectations
### 0.2.7 (2025-06-29)
- **Enhanced project configuration** with updated setuptools and setuptools-scm for better version management
- **Improved dependency management** with specific version constraints for all development and testing dependencies
- **Enhanced development tooling** with comprehensive linting, formatting, and type checking configurations (ruff, black, isort, mypy, bandit)
- **Better test infrastructure** with enhanced pytest configuration, coverage reporting, and test categorization
- **Documentation improvements** with updated API examples and corrected return type documentation for batch operations
- **Code quality enhancements** with improved logging and error handling in SQLite operations
- **Enhanced test coverage** for performance and integration scenarios with robust validation of new features
- **Project metadata improvements** with additional classifiers, keywords, and better package discovery
### 0.2.6 (2025-06-29)
- **Enhanced input validation for `split_sql_file()` function** with proper handling of invalid path types
- **Improved error handling** for `None`, empty strings, and non-string/non-Path objects in file path parameters
- **Better type safety** with explicit validation of file path parameters before processing
- **Consistent error messaging** with descriptive ValueError messages for invalid inputs
- **Enhanced robustness** of SQL file processing with comprehensive input validation
- **Test coverage improvements** with edge case testing for invalid file path scenarios
### 0.2.5 (2025-06-28)
- **Enhanced error handling for database maintenance operations** with proper exception wrapping and rollback support
- **Improved robustness of maintenance methods** (`vacuum`, `analyze`, `integrity_check`, `optimize`) with try/except blocks
- **Better error messages** for maintenance operations with descriptive failure descriptions
- **Comprehensive test coverage** for error handling scenarios in maintenance operations
- **Consistent error handling patterns** across all database maintenance methods
- **Enhanced SQLite-specific functionality** with comprehensive database management features
- **New `get_sqlite_info()` method** to retrieve SQLite version, database statistics, and PRAGMA values
- **New `configure_pragma()` method** for dynamic SQLite PRAGMA configuration (cache_size, synchronous, etc.)
- **New `vacuum()` method** for database space reclamation and optimization
- **New `analyze()` method** for updating query planner statistics (all tables or specific table)
- **New `integrity_check()` method** for database integrity verification
- **New `optimize()` method** for running SQLite optimization commands
- **Enhanced engine configuration** with SQLite-specific connection parameters (timeout, check_same_thread)
- **Improved transaction support** with proper isolation level configuration (DEFERRED mode)
- **Enhanced performance configuration** with additional SQLite pragmas (foreign_keys, busy_timeout, auto_vacuum)
- **Comprehensive SQLite-specific test suite** with 16 new test methods covering all new functionality
- **Better error handling** with SQLiteError exception class for SQLite-specific errors
- **Documentation updates** with complete API reference for all new SQLite-specific methods
- **Performance optimizations** with enhanced SQLite pragma settings for better concurrency and reliability
### 0.2.4 (2025-06-27)
- **Test suite refactoring** with removal of private function tests to focus on public API testing
- **Improved test maintainability** by eliminating tests for internal implementation details
- **Enhanced nested comment handling** in SQL parsing with more realistic expectations
- **Better test coverage** focusing on public interface behavior rather than implementation details
- **Code quality improvements** with cleaner test structure and more maintainable test suite
- **Documentation updates** reflecting current test coverage and API expectations
### 0.2.3 (2025-06-27)
- **Enhanced thread safety and concurrency** with improved locking mechanisms and connection management
- **Optimized database engine performance** with refined worker thread handling and request processing
- **Improved SQL statement parsing** with better support for complex SQL constructs and edge cases
- **Enhanced error handling and recovery** with more robust exception management and detailed error reporting
- **Code quality improvements** with comprehensive test coverage and performance benchmarking
- **Memory usage optimizations** with better resource management and cleanup procedures
- **Documentation enhancements** with improved API documentation and usage examples
### 0.2.2 (2025-06-26)
- **Code refactoring and architectural improvements** for better maintainability and performance
- **Enhanced error handling and logging** with more detailed exception information
- **Optimized database performance** with refined SQLite pragma configurations
- **Enhanced SQL parsing robustness** with better handling of edge cases and malformed SQL
- **Code documentation improvements** with more detailed docstrings and usage examples
### 0.2.1 (2025-06-26)
- **Refactored SQL parsing to use sqlparse library** for improved reliability and standards compliance
- **Enhanced SQL comment removal** with proper handling of comments within string literals
- **Improved SQL statement parsing** with better handling of complex SQL constructs including BEGIN...END blocks
- **Added sqlparse dependency** for robust SQL parsing and formatting
- **Improved error handling** for malformed SQL statements
- **Better support for complex SQL constructs** including triggers, stored procedures, and multi-line statements
### 0.2.0 (2025-06-25)
- **New batch SQL execution feature** for executing multiple SQL statements in a single operation
- **SQL statement parsing and validation** with automatic comment removal
- **Enhanced error handling** for batch operations with individual statement error reporting
- **Thread-safe batch processing** with proper connection management
- **Support for mixed DDL/DML operations** in batch mode
- **Automatic semicolon handling** within string literals and BEGIN...END blocks
- **Batch performance testing** and benchmarking tools
- **Improved SQL validation** with comprehensive statement type checking
- **Enhanced documentation** with batch operation examples and API reference
### 0.1.3 (2025-06-23)
- Thread-safe SQLite operations with worker thread pool
- SQLAlchemy 2.0+ compatibility with modern async patterns
- Performance optimizations with SQLite-specific pragmas
- Consolidated API with `execute()` method handling both single and bulk operations
- Transaction support for complex operations
- Statistics tracking for monitoring performance
- Extensive performance testing suite with benchmarks
- Memory usage monitoring (optional, requires `psutil`)
- Thread safety through proper connection management
- WAL mode and optimized cache settings for better concurrency