# Python Library PyPI Boilerplate
## Project Structure
## Quick Start
### 1. Create Configuration File
Create a `.py-perf.yaml` file in your project directory:
```yaml
py_perf:
  enabled: true
  min_execution_time: 0.001

local:
  enabled: true
  data_dir: "./perf_data"
  format: "json"

filters:
  exclude_modules:
    - "requests"
    - "boto3"
```
### 2. Use in Your Code
```python
from py_perf import PyPerf
import time

# Initialize the performance tracker (loads .py-perf.yaml automatically)
perf = PyPerf()

# Method 1: Use as decorator
@perf.time_it
def slow_function(n):
    time.sleep(0.1)
    return sum(range(n))

# Method 2: Use as decorator with arguments
@perf.time_it(store_args=True)
def process_data(data, multiplier=2):
    return [x * multiplier for x in data]

# Call your functions
result1 = slow_function(1000)
result2 = process_data([1, 2, 3, 4, 5])

# Performance data is automatically collected and uploaded
# - Local mode: Saved to ./perf_data/ as JSON files
# - AWS mode: Uploaded to DynamoDB on program exit
# - View data using the web dashboard at http://localhost:8000

# Optional: Get timing results programmatically
summary = perf.get_summary()
print(f"Tracked {summary['call_count']} function calls")
```
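If you are curious how a decorator can work both bare (`@perf.time_it`) and with arguments (`@perf.time_it(store_args=True)`), here is a minimal sketch of that dual-form pattern. This is an illustration only, not PyPerf's actual implementation:

```python
import functools
import time

# Hypothetical sketch of a decorator usable both bare (@time_it) and
# with arguments (@time_it(store_args=True)); not PyPerf's real code.
def time_it(func=None, *, store_args=False):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            elapsed = time.perf_counter() - start
            record = {"function": fn.__name__, "seconds": elapsed}
            if store_args:
                record["args"] = repr((args, kwargs))
            print(record)  # a real tracker would buffer this for upload
            return result
        return wrapper
    if func is None:        # called with arguments: @time_it(store_args=True)
        return decorator
    return decorator(func)  # called bare: @time_it
```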
### 3. View Results
**Automatic Data Collection:**
- **Local Mode**: Performance data is automatically saved to `./perf_data/` as JSON files (see the inspection sketch below)
- **AWS Mode**: Data is automatically uploaded to DynamoDB when your program exits
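As a quick sanity check in local mode, you can inspect the saved files directly. This sketch only assumes that `./perf_data/` contains JSON files; the per-file record schema is not documented here:

```python
import json
from pathlib import Path

# Hypothetical inspection pass: the directory matches the config above,
# but the record schema inside each file is an assumption.
for path in sorted(Path("./perf_data").glob("*.json")):
    with path.open() as f:
        record = json.load(f)
    print(path.name, "->", type(record).__name__)
```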
**Web Dashboard:**
For visualizing performance data, use the separate [py-perf-viewer](https://github.com/jeremycharlesgillespie/py-perf-viewer) Django dashboard that provides:
- Performance overview and metrics
- Function-by-function analysis
- Historical trends and comparisons
- Advanced filtering and search
For AWS integration and production setup, see the Configuration section below.
## Building and Publishing
This project includes automated scripts for building and publishing the package with automatic version incrementing.
### Quick Build & Upload
```bash
# Build and upload in one step (recommended)
./upload_package.sh
```
This script will:
- Automatically increment the version (0.1.4 → 0.1.5)
- Build the package
- Validate the package
- Give you options to upload to Test PyPI or Production PyPI
- Use your configured API tokens from `.pypirc`
### Build Only
```bash
# Just build the package (increments version)
python3 build_package.py
```
This script will:
- Automatically increment the version
- Clean previous builds
- Build both wheel and source distribution
- Validate the package
### Manual Build (Advanced)
```bash
# Traditional manual build (no version increment)
python -m build

# Manual upload to PyPI
pip install twine
twine upload dist/*
```
### Version Management
- **Automatic**: Both scripts bump the patch version by one on each run (e.g., 0.1.4 → 0.1.5; see the sketch below)
- **Current version**: Check with `python3 -c "from version_manager import get_current_version; print(get_current_version())"`
- **Manual increment**: Run `python3 version_manager.py`
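For reference, the patch bump described above amounts to a tiny helper along these lines. This is a hypothetical sketch, not the actual contents of `version_manager.py`:

```python
# Hypothetical sketch of a patch-version bump (not the real version_manager.py).
def bump_patch(version: str) -> str:
    major, minor, patch = (int(part) for part in version.split("."))
    return f"{major}.{minor}.{patch + 1}"

assert bump_patch("0.1.4") == "0.1.5"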
### PyPI Configuration
Create a `.pypirc` file in the project root with your API tokens:
```ini
[distutils]
index-servers =
    pypi
    testpypi

[pypi]
username = __token__
password = your-pypi-api-token

[testpypi]
repository = https://test.pypi.org/legacy/
username = __token__
password = your-testpypi-api-token
```
### Testing Your Package
Test on Test PyPI first (recommended):
1. Run `./upload_package.sh` and choose option 1 (Test PyPI)
2. Install and test: `pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ py-perf-jg`
3. If everything works (see the smoke test below), upload to production PyPI
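After installing from Test PyPI, a quick smoke test confirms the package imports and initializes cleanly. This assumes the distribution installs the `py_perf` package, as used throughout this README:

```python
# Minimal smoke test after installing from Test PyPI
# (assumes the distribution installs the py_perf package).
import py_perf

perf = py_perf.PyPerf()
print("py-perf imported and initialized OK")
```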
# Py-Perf
Py-Perf is an easy-to-install, easy-to-configure Python library for tracking and reporting the performance of your Python code.
## Installation
```bash
pip install py-perf
```
## Development
### Setup Development Environment
```bash
# Clone the repository
git clone https://github.com/jeremycharlesgillespie/py-perf.git
cd py-perf

# Create virtual environment
python3 -m venv venv

# Activate virtual environment
source venv/bin/activate  # On macOS/Linux
# or
venv\Scripts\activate     # On Windows

# Install dependencies
pip install -r requirements.txt

# For development (includes testing tools)
pip install -r requirements-dev.txt

# Install pre-commit hooks (optional)
pre-commit install
```
## Configuration
PyPerf uses YAML configuration files for flexible and easy setup. Configuration sources are loaded in priority order (illustrated below):
1. **Default configuration** (built-in defaults)
2. **User configuration files** (`.py-perf.yaml`, `py-perf.yaml`)
3. **Runtime overrides** (passed to PyPerf constructor)
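The layering can be pictured as a merge in which later sources win. The following is an illustrative sketch assuming OmegaConf-style merging, not PyPerf's internal code:

```python
from omegaconf import OmegaConf

# Illustrative only: later sources override earlier ones,
# mirroring the priority order listed above.
defaults = OmegaConf.create({"py_perf": {"enabled": True, "debug": False}})
user_file = OmegaConf.create({"py_perf": {"debug": True}})    # from .py-perf.yaml
runtime = OmegaConf.create({"py_perf": {"enabled": False}})   # constructor override

merged = OmegaConf.merge(defaults, user_file, runtime)
print(OmegaConf.to_yaml(merged))  # enabled: false, debug: true
```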
### Quick Start - Local Mode (No AWS Required)
Create a `.py-perf.yaml` file in your project directory:
```yaml
py_perf:
  enabled: true
  debug: false
  min_execution_time: 0.001

local:
  enabled: true  # Use local storage, no AWS required
  data_dir: "./perf_data"
  format: "json"
  max_records: 1000

filters:
  exclude_modules:
    - "boto3"
    - "requests"
    - "urllib3"
  track_arguments: false
```
### AWS DynamoDB Mode
For production AWS usage:
```yaml
py_perf:
  enabled: true
  min_execution_time: 0.001

aws:
  region: "us-east-1"
  table_name: "py-perf-data"
  auto_create_table: true

upload:
  strategy: "on_exit"  # on_exit, real_time, batch, manual

local:
  enabled: false  # Disable local storage
```
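The `on_exit` strategy corresponds to flushing buffered records when the interpreter shuts down. A minimal sketch of that mechanism (not PyPerf's actual uploader):

```python
import atexit

records = []  # timing records buffered during the run

def flush_records():
    # A real uploader would batch-write `records` to DynamoDB here.
    print(f"Uploading {len(records)} records on exit")

atexit.register(flush_records)
```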
### Advanced Configuration
See `.py-perf.yaml.example` for all configuration options including:
- **Performance filtering** (modules, functions, execution time thresholds)
- **Upload strategies** (real-time, batch, manual)
- **Logging configuration**
- **Dashboard settings**
### Runtime Configuration
You can also configure PyPerf programmatically:
```python
from py_perf import PyPerf

# Local-only mode
perf = PyPerf({
    "local": {"enabled": True},
    "py_perf": {"debug": True}
})

# AWS mode with custom settings
perf = PyPerf({
    "aws": {
        "region": "us-east-1",
        "table_name": "my-perf-data"
    },
    "local": {"enabled": False}
})
```
### Configuration File Locations
PyPerf searches for configuration files in this order (sketched below):
1. `./py-perf.yaml` (current directory)
2. `./.py-perf.yaml` (current directory, hidden file)
3. `~/.py-perf.yaml` (home directory)
4. `~/.config/py-perf/config.yaml` (XDG config directory)
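That search order amounts to a first-match lookup like this hypothetical sketch:

```python
from pathlib import Path

# Hypothetical first-match lookup mirroring the documented search order.
CANDIDATES = [
    Path("py-perf.yaml"),
    Path(".py-perf.yaml"),
    Path.home() / ".py-perf.yaml",
    Path.home() / ".config" / "py-perf" / "config.yaml",
]

def find_config_file():
    for candidate in CANDIDATES:
        if candidate.is_file():
            return candidate
    return None  # fall back to built-in defaults
```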
### AWS Setup
For AWS mode:
1. Configure AWS CLI: `aws configure`
2. Create your `.py-perf.yaml` with AWS settings
3. PyPerf will automatically create DynamoDB tables if needed (roughly as sketched below)
See `AWS_SETUP.md` for detailed AWS configuration instructions.
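For context, auto-creating a DynamoDB table with boto3 looks roughly like the following. The table name matches the YAML example above, but the key schema here is an assumption for illustration, not py-perf's actual table layout:

```python
import boto3

# Hypothetical sketch: key schema is assumed, not py-perf's real layout.
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.create_table(
    TableName="py-perf-data",
    KeySchema=[{"AttributeName": "session_id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "session_id", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()
```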
### Virtual Environment Usage
Always activate the virtual environment before running PyPerf:
```bash
# Activate virtual environment
source venv/bin/activate

# Run your PyPerf application
python3 tester.py

# Deactivate when done
deactivate
```
## Web Dashboard
For a comprehensive web dashboard to visualize and analyze your py-perf performance data, use the separate [py-perf-viewer](https://github.com/jeremycharlesgillespie/py-perf-viewer) project.
### py-perf-viewer Features
- **Performance Overview**: Key metrics, slowest functions, most active hosts
- **Advanced Filtering**: Filter by hostname, date range, function name, session ID
- **Sorting**: Sort records by timestamp, hostname, total calls, wall time, etc.
- **Function Analysis**: Detailed performance analysis for specific functions
- **REST API**: Programmatic access to performance data
- **Real-time Data**: Automatically displays latest performance data from DynamoDB
### Installation
```bash
# Install the dashboard separately
pip install py-perf-viewer

# Or clone and run the standalone project
git clone https://github.com/jeremycharlesgillespie/py-perf-viewer
cd py-perf-viewer
pip install -r requirements.txt
python start_viewer.py
```
Visit the [py-perf-viewer repository](https://github.com/jeremycharlesgillespie/py-perf-viewer) for detailed setup and usage instructions.
### Dashboard Screenshots
Here's what you can expect when using the py-perf-viewer web dashboard to visualize your py-perf performance data:
#### Main Dashboard Overview

*Complete performance overview with key metrics, slowest functions, and recent activity*
#### Performance Records Browser

*Browse and filter all performance records with advanced search capabilities*
#### Detailed Record Analysis

*Detailed breakdown of individual performance records showing function-level metrics*
#### Function Performance Analysis

*Cross-record analysis of specific functions with performance trends over time*
### REST API Endpoints
The dashboard also provides JSON API endpoints for programmatic access:
#### Performance Metrics API
`GET /api/metrics/` - Summary performance metrics in JSON format
```json
{
  "total_records": 2,
  "total_sessions": 2,
  "unique_hostnames": ["Mac.home.local"],
  "unique_functions": [
    "slow_io_operation",
    "cpu_intensive_task",
    "check_aws_credentials",
    "mixed_workload",
    "fast_calculation",
    "variable_duration"
  ],
  "avg_session_duration": 0.0,
  "slowest_functions": [
    ["check_aws_credentials", 0.294],
    ["slow_io_operation", 0.105],
    ["mixed_workload", 0.055]
  ],
  "most_active_hosts": [
    ["Mac.home.local", 14]
  ]
}
```
#### Hostnames API
`GET /api/hostnames/` - List of unique hostnames for filtering
```json
{
  "hostnames": ["Mac.home.local"]
}
```
#### Functions API
`GET /api/functions/` - List of unique function names for analysis
```json
{
  "functions": [
    "check_aws_credentials",
    "cpu_intensive_task",
    "fast_calculation",
    "mixed_workload",
    "slow_io_operation",
    "variable_duration"
  ]
}
```
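A small client-side example of consuming these endpoints, assuming the dashboard is running locally on port 8000 as in the Quick Start:

```python
import requests

# Query the metrics endpoint shown above; host and port are assumptions.
resp = requests.get("http://localhost:8000/api/metrics/", timeout=5)
resp.raise_for_status()
metrics = resp.json()

for name, seconds in metrics["slowest_functions"]:
    print(f"{name}: {seconds:.3f}s")
```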
### Running PyPerf Library Tests
```bash
# Activate virtual environment
source venv/bin/activate

# Run PyPerf library tests
python tester.py
```
### Code Formatting
```bash
# Activate virtual environment
source venv/bin/activate

# Format code (if dev dependencies installed)
black src tests
isort src tests
flake8 src tests
mypy src
```
## License
MIT License - see LICENSE file for details.