<img width="64" height="64" alt="image" src="https://github.com/user-attachments/assets/663b4497-d023-49a6-9ce9-60c50c86df02" />
# Profilis
> A high-performance, non-blocking profiler for Python web applications.
[Documentation](https://ankan97dutta.github.io/profilis/) · [CI](https://github.com/ankan97dutta/profilis/actions/workflows/ci.yml)

---
## Overview
Profilis provides drop-in observability across APIs, functions, and database queries with minimal performance impact. It's designed to be:
- **Non-blocking**: Async collection with configurable batching and backpressure handling
- **Framework agnostic**: Works with Flask and custom applications (FastAPI/Sanic planned)
- **Database aware**: Built-in support for SQLAlchemy (pyodbc/MongoDB/Neo4j planned)
- **Production ready**: Configurable sampling, error tracking, and multiple export formats
<img width="1126" height="642" alt="Screenshot 2025-09-01 at 12 38 50 PM" src="https://github.com/user-attachments/assets/7c9d541b-4984-4575-92fb-8c0ec48dff55" />
## Features
- **Request Profiling**: Automatic HTTP request/response timing and status tracking
- **Function Profiling**: Decorator-based function timing with exception tracking
- **Database Instrumentation**: SQLAlchemy query performance monitoring with row counts
- **Built-in UI**: Real-time dashboard for monitoring and debugging
- **Multiple Exporters**: JSONL (with rotation), Console
- **Runtime Context**: Distributed tracing with trace/span ID management
- **Configurable Sampling**: Control data collection volume in production
## Installation
Install the core package with optional dependencies for your specific needs:
### Option 1: Using pip with extras (Recommended)
```bash
# Core package only
pip install profilis

# With Flask support
pip install "profilis[flask]"

# With Flask + SQLAlchemy (database) support
pip install "profilis[flask,sqlalchemy]"

# With all integrations
pip install "profilis[all]"
```
### Option 2: Using requirements files
```bash
# Minimal setup (core only)
pip install -r requirements-minimal.txt

# Flask integration
pip install -r requirements-flask.txt

# SQLAlchemy integration
pip install -r requirements-sqlalchemy.txt

# All integrations
pip install -r requirements-all.txt
```
### Option 3: Manual installation
```bash
# Core dependencies
pip install "typing_extensions>=4.0"

# Flask support
pip install "flask[async]>=3.0"

# SQLAlchemy support
pip install "sqlalchemy>=2.0" aiosqlite greenlet

# Performance optimization
pip install "orjson>=3.8"
```
## Quick Start
### Flask Integration
```python
from flask import Flask
from profilis.flask.adapter import ProfilisFlask
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector

# Setup exporter and collector
exporter = JSONLExporter(dir="./logs", rotate_bytes=1024*1024, rotate_secs=3600)
collector = AsyncCollector(exporter, queue_size=2048, batch_max=128, flush_interval=0.1)

# Create Flask app and integrate Profilis
app = Flask(__name__)
profilis = ProfilisFlask(
    app,
    collector=collector,
    exclude_routes=["/health", "/metrics"],
    sample=1.0  # 100% sampling
)

@app.route('/api/users')
def get_users():
    return {"users": ["alice", "bob"]}

# Start the app
if __name__ == "__main__":
    app.run(debug=True)
```
### Function Profiling
```python
from profilis.decorators.profile import profile_function
from profilis.core.emitter import Emitter
from profilis.exporters.console import ConsoleExporter
from profilis.core.async_collector import AsyncCollector

# Setup profiling
exporter = ConsoleExporter(pretty=True)
collector = AsyncCollector(exporter, queue_size=128, flush_interval=0.2)
emitter = Emitter(collector)

@profile_function(emitter)
def expensive_calculation(n: int) -> int:
    """This function will be automatically profiled."""
    result = sum(i * i for i in range(n))
    return result

@profile_function(emitter)
async def async_operation(data: list) -> list:
    """Async functions are also supported."""
    processed = [item * 2 for item in data]
    return processed

# Use the profiled functions
result = expensive_calculation(1000)
```
### Manual Event Emission
```python
from profilis.core.emitter import Emitter
from profilis.exporters.jsonl import JSONLExporter
from profilis.core.async_collector import AsyncCollector
from profilis.runtime import use_span, span_id

# Setup
exporter = JSONLExporter(dir="./logs")
collector = AsyncCollector(exporter)
emitter = Emitter(collector)

# Create a trace context
with use_span(trace_id=span_id()):
    # Emit custom events
    emitter.emit_req("/api/custom", 200, dur_ns=15000000)  # 15ms
    emitter.emit_fn("custom_function", dur_ns=5000000)  # 5ms
    emitter.emit_db("SELECT * FROM users", dur_ns=8000000, rows=100)

# Close collector to flush remaining events
collector.close()
```
### Built-in Dashboard
```python
from flask import Flask
from profilis.flask.ui import make_ui_blueprint
from profilis.core.stats import StatsStore

app = Flask(__name__)
stats = StatsStore()  # 15-minute rolling window

# Mount the dashboard at /_profilis
ui_bp = make_ui_blueprint(stats, ui_prefix="/_profilis")
app.register_blueprint(ui_bp)

# Visit http://localhost:5000/_profilis to see the dashboard
```
## Advanced Usage
### Custom Exporters
```python
from profilis.core.async_collector import AsyncCollector
from profilis.exporters.base import BaseExporter

class CustomExporter(BaseExporter):
    def export(self, events: list[dict]) -> None:
        for event in events:
            # Custom export logic
            print(f"Custom export: {event}")

# Use custom exporter
exporter = CustomExporter()
collector = AsyncCollector(exporter)
```
### Runtime Context Management
```python
from profilis.runtime import use_span, span_id, get_trace_id, get_span_id

# Create distributed trace context
with use_span(trace_id="trace-123", span_id="span-456"):
    current_trace = get_trace_id()  # "trace-123"
    current_span = get_span_id()  # "span-456"

    # Nested spans inherit trace context
    with use_span(span_id="span-789"):
        nested_span = get_span_id()  # "span-789"
        parent_trace = get_trace_id()  # "trace-123"
```
### Performance Tuning
```python
from profilis.core.async_collector import AsyncCollector

# High-throughput configuration
collector = AsyncCollector(
    exporter,
    queue_size=8192,      # Large queue for high concurrency
    batch_max=256,        # Larger batches for efficiency
    flush_interval=0.05,  # More frequent flushing
    drop_oldest=True      # Drop events under backpressure
)

# Low-latency configuration
collector = AsyncCollector(
    exporter,
    queue_size=512,       # Smaller queue for lower latency
    batch_max=32,         # Smaller batches for faster processing
    flush_interval=0.01,  # Very frequent flushing
    drop_oldest=False     # Don't drop events
)
```
## Configuration
### Environment Variables
Environment-variable support is planned for a future release; for now, all configuration is done programmatically.
### Sampling Strategies
```python
from profilis.flask.adapter import ProfilisFlask

# Random sampling (app and collector as in the Quick Start)
profilis = ProfilisFlask(app, collector=collector, sample=0.1)  # 10% of requests

# Route-based sampling
profilis = ProfilisFlask(
    app,
    collector=collector,
    exclude_routes=["/health", "/metrics", "/static"],
    sample=1.0
)
```
## Exporters
### JSONL Exporter
```python
from profilis.exporters.jsonl import JSONLExporter

# With rotation
exporter = JSONLExporter(
    dir="./logs",
    rotate_bytes=1024*1024,  # 1MB per file
    rotate_secs=3600         # Rotate every hour
)
```
### Console Exporter
```python
from profilis.exporters.console import ConsoleExporter

# Pretty-printed output for development
exporter = ConsoleExporter(pretty=True)

# Compact output for production
exporter = ConsoleExporter(pretty=False)
```
## Performance Characteristics
- **Event Creation**: ≤15µs per event
- **Memory Overhead**: ~100 bytes per event
- **Throughput**: 100K+ events/second on modern hardware
- **Latency**: Sub-millisecond collection overhead
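
These figures are indicative; a rough way to sanity-check them on your own hardware is a micro-benchmark around the emitter. The sketch below assumes the APIs shown earlier in this README; `NullExporter` is a throwaway class modelled on the custom-exporter example, not part of Profilis:

```python
import time

from profilis.core.async_collector import AsyncCollector
from profilis.core.emitter import Emitter
from profilis.exporters.base import BaseExporter

class NullExporter(BaseExporter):
    """Discard events so the loop measures emit/queue cost only."""
    def export(self, events: list[dict]) -> None:
        pass

collector = AsyncCollector(NullExporter(), queue_size=8192, batch_max=256)
emitter = Emitter(collector)

N = 100_000
start = time.perf_counter_ns()
for _ in range(N):
    emitter.emit_fn("benchmark_fn", dur_ns=1_000)
elapsed = time.perf_counter_ns() - start
collector.close()

print(f"avg emit cost: {elapsed / N / 1_000:.2f} µs/event")
```

Results will vary with the exporter, queue settings, and machine, so treat the published numbers as order-of-magnitude guidance rather than guarantees.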
## Documentation
Full documentation is available at: [Profilis Docs](https://ankan97dutta.github.io/profilis/)
Docs are written in Markdown under [`docs/`](./docs) and built with [MkDocs Material](https://squidfunk.github.io/mkdocs-material/).
### Available Documentation
- **[Getting Started](https://ankan97dutta.github.io/profilis/guides/getting-started/)** - Quick setup and basic usage
- **[Configuration](https://ankan97dutta.github.io/profilis/guides/configuration/)** - Tuning and customization
- **[Flask Integration](https://ankan97dutta.github.io/profilis/adapters/flask/)** - Flask adapter documentation
- **[SQLAlchemy Support](https://ankan97dutta.github.io/profilis/databases/sqlalchemy/)** - Database instrumentation
- **[JSONL Exporter](https://ankan97dutta.github.io/profilis/exporters/jsonl/)** - Log file output
- **[Built-in UI](https://ankan97dutta.github.io/profilis/ui/ui/)** - Dashboard documentation
- **[Architecture](https://ankan97dutta.github.io/profilis/architecture/architecture/)** - System design
To preview locally:
```bash
pip install mkdocs mkdocs-material mkdocs-mermaid2-plugin
mkdocs serve
```
## Development
- See [Contributing](./docs/meta/contributing.md) and [Development Guidelines](./docs/meta/development-guidelines.md).
- Branch strategy: trunk‑based (`feat/*`, `fix/*`, `perf/*`, `chore/*`).
- Commits follow [Conventional Commits](https://www.conventionalcommits.org/).
## Roadmap
See [Profilis – v0 Roadmap Project](https://github.com/ankan97dutta/profilis/projects) and [`docs/overview/roadmap.md`](./docs/overview/roadmap.md).
## License
[MIT](./LICENSE)