# Finomeny Logger
[PyPI](https://badge.fury.io/py/finomeny-logger) · [Project page](https://pypi.org/project/finomeny-logger/) · [License: MIT](https://opensource.org/licenses/MIT) · [codecov](https://codecov.io/gh/finomeny/finomeny-logger)
Production-ready structured logging library for Finomeny services. Provides consistent, JSON-structured logs across AWS services with automatic context detection, PII redaction, and comprehensive observability features.
## 🚀 Features
- **Structured JSON Logging**: Enforces consistent log format across all services
- **AWS Auto-Detection**: Automatically detects Lambda, EC2, and other AWS contexts
- **PII Protection**: Built-in PII detection and redaction with configurable strategies
- **Schema Validation**: JSON Schema validation for log consistency
- **Distributed Tracing**: Correlation IDs and trace IDs for request tracking
- **Domain Contexts**: Pre-built context blocks for Snowflake, Salesforce, Airflow, etc.
- **Performance Monitoring**: Built-in operation tracing with timing metrics
- **Compliance Ready**: GDPR-compliant logging with automatic data classification
## 📦 Installation
```bash
pip install finomeny-logger
```
For development:
```bash
pip install "finomeny-logger[dev]"
```
## 🏃 Quick Start
### Basic Usage
```python
from finomeny_logger import FinomenyLogger, LogCategory

# Initialize logger (auto-detects AWS context)
logger = FinomenyLogger(
    service="portfolio-api",
    component="lambda",
    version="1.2.3"
)

# Log business events
logger.info(
    "Portfolio.Created",
    "New portfolio created successfully",
    category=LogCategory.BUSINESS_TRANSACTION,
    portfolio_id="PORT-12345",
    actor_id="user123",
    metrics={"portfolios_created": 1}
)

# Log with error handling
try:
    risky_operation()
except Exception as e:
    logger.error(
        "Portfolio.CreationFailed",
        "Portfolio creation failed",
        error=e,
        portfolio_id="PORT-12345"
    )
```
### AWS Lambda Function
```python
from finomeny_logger import FinomenyLogger, create_ingestion_context

def lambda_handler(event, context):
    logger = FinomenyLogger(
        service="portfolio-ingester",
        component="lambda",
        version="2.1.4"
    )

    portfolio_id = event['portfolio_id']

    # Use operation tracing for automatic timing
    with logger.trace_operation(
        "ProcessPortfolio",
        portfolio_id=portfolio_id,
        source_system="s3",
        target_system="snowflake"
    ) as tracer:

        # Your processing logic here
        process_portfolio_file(event['s3_key'])

        # Log with domain context
        logger.info(
            "Portfolio.Processed",
            f"Portfolio {portfolio_id} processed successfully",
            portfolio_id=portfolio_id,
            metrics={"rows_processed": 10000},
            **create_ingestion_context(
                source_type="xls",
                file_key=event['s3_key']
            )
        )

    return {"statusCode": 200}
```
### Airflow Integration
```python
from finomeny_logger import FinomenyLogger, create_airflow_context

def my_airflow_task(**context):
    logger = FinomenyLogger(
        service="data-pipeline",
        component="airflow",
        version="1.0.0"
    )

    # Extract Airflow context
    dag_id = context['dag'].dag_id
    task_id = context['task'].task_id
    run_id = context['run_id']

    logger.info(
        "ETL.TaskStarted",
        f"Starting ETL task: {task_id}",
        **create_airflow_context(dag_id, task_id, run_id)
    )

    # Your ETL logic here
```
## 📊 Structured Log Format
Every log record follows this format:
```json
{
  "ts": "2025-09-11T09:30:15.123Z",
  "env": "prod",
  "service": "portfolio-ingester",
  "component": "lambda",
  "version": "2.1.4",
  "region": "eu-west-2",
  "level": "INFO",
  "category": "BusinessTransaction",
  "event": "Portfolio.Processed",
  "message": "Portfolio processing completed",
  "trace_id": "0f8fad5b-d9cb-469f-a165-70867728950e",
  "correlation_id": "req-8d3e1c1b9f",
  "portfolio_id": "PORT-12345",
  "actor_id": "user123",
  "pii_flags": ["none"],
  "metrics": {
    "rows_processed": 10000,
    "latency_ms": 1250
  },
  "tags": ["portfolio", "processing"],
  "kvs": {
    "file_key": "data/portfolio.xlsx"
  }
}
```
### Required Fields
- `ts`: ISO 8601 timestamp
- `env`: Environment (dev/stg/prod)
- `service`: Service name
- `component`: Component type (lambda/api/airflow/etc.)
- `version`: Service version
- `region`: AWS region
- `level`: Log level (DEBUG/INFO/WARN/ERROR/CRITICAL)
- `category`: Event category (Security/Compliance/BusinessTransaction/Engagement/TechnicalOps)
- `event`: Event name (PascalCase with dots)
- `message`: Human-readable message
- `pii_flags`: PII classification
- `metrics`: Quantitative data
- `kvs`: Additional key-value pairs
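
To make the contract concrete, here is a minimal sketch of the kind of check `validate_schema=True` implies, using the third-party `jsonschema` package. The schema bundled with the library is the authoritative one; this is an illustrative subset.

```python
# Illustrative subset of the log schema; the library's bundled
# schema is the authoritative one and covers more fields.
from jsonschema import validate

LOG_SCHEMA = {
    "type": "object",
    "required": [
        "ts", "env", "service", "component", "version", "region",
        "level", "category", "event", "message", "pii_flags",
    ],
    "properties": {
        "level": {"enum": ["DEBUG", "INFO", "WARN", "ERROR", "CRITICAL"]},
        "pii_flags": {"type": "array", "items": {"type": "string"}},
    },
}

record = {
    "ts": "2025-09-11T09:30:15.123Z", "env": "prod",
    "service": "portfolio-api", "component": "lambda", "version": "1.2.3",
    "region": "eu-west-2", "level": "INFO", "category": "BusinessTransaction",
    "event": "Portfolio.Created", "message": "ok", "pii_flags": ["none"],
}

validate(instance=record, schema=LOG_SCHEMA)  # raises ValidationError on bad records
```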
## 🔐 PII Protection
The logger automatically detects and redacts PII:
```python
# Automatic PII redaction
logger.info(
    "User.ContactUpdated",
    "User updated contact: john.doe@example.com and +1-555-123-4567",
    # Output: "User updated contact: [REDACTED_EMAIL] and [REDACTED_PHONE]"
    pii_flags=["contains-pii"]
)

# Sensitive ID hashing
logger.info(
    "Payment.Processed",
    "Payment processed for debtor",
    debtor_id="DEBT-12345"  # Automatically hashed to protect PII
)
```
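
Redaction like this is typically pattern-based. The sketch below shows the general idea with plain regular expressions; it is illustrative only, and the library's actual patterns and replacement tokens may differ.

```python
# Illustrative regex-based redaction pass, similar in spirit to what
# redact_pii=True performs; not the library's actual implementation.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact(message: str) -> str:
    message = EMAIL_RE.sub("[REDACTED_EMAIL]", message)
    return PHONE_RE.sub("[REDACTED_PHONE]", message)

print(redact("Contact john.doe@example.com or +1-555-123-4567"))
# Contact [REDACTED_EMAIL] or [REDACTED_PHONE]
```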
## 📋 Domain Contexts
Use pre-built context blocks for common integrations:
### Snowflake Context
```python
from finomeny_logger import create_snowflake_context

logger.info(
    "Transform.Complete",
    "Data transformation finished",
    **create_snowflake_context(
        query_id="01a12345-0400-5db1-0000-0f5c00a1bdf6",
        rows_affected=10000,
        credit_cost_est=0.05
    )
)
```
### Salesforce Context
```python
logger.info(
    "Salesforce.UpsertComplete",
    "Records upserted to Salesforce",
    salesforce_ctx={
        "api": "Bulk",
        "object": "Debt__c",
        "operation": "upsert",
        "batch_id": "751xxxxxxxxxxxx",
        "success_count": 9950,
        "error_count": 50
    }
)
```
### Ingestion Context
```python
from finomeny_logger import create_ingestion_context

logger.info(
    "File.Processed",
    "File ingestion completed",
    **create_ingestion_context(
        source_type="csv",
        file_key="data/import.csv",
        checksum="sha256:abc123",
        headers_detected=True
    )
)
```
## 🎯 Log Categories & Levels
### Categories
- `Security`: Authentication, authorization, access control
- `Compliance`: GDPR, audit trails, regulatory events
- `BusinessTransaction`: Core business operations
- `Engagement`: User interactions, communications
- `TechnicalOps`: Infrastructure, performance, errors
### Levels
- `DEBUG`: Detailed diagnostic information
- `INFO`: General operational messages
- `WARN`: Warning conditions that should be addressed
- `ERROR`: Error conditions that don't stop operation
- `CRITICAL`: Critical errors requiring immediate attention
## ⚡ Performance Monitoring
Built-in operation tracing with automatic timing:
```python
# Automatic timing and error handling
with logger.trace_operation(
    "DatabaseQuery",
    portfolio_id="PORT-123",
    metrics={"query_complexity": "high"}
) as tracer:

    # Your operation here
    result = execute_complex_query()

    # Timing is automatically added to metrics
    # Errors are automatically logged if an exception occurs
```
## 🔧 Configuration
### Environment Variables
The logger auto-detects configuration from the environment:
- `ENVIRONMENT` / `ENV` / `STAGE`: Environment name
- `AWS_REGION`: AWS region
- `AWS_LAMBDA_FUNCTION_NAME`: Lambda function name
- `_X_AMZN_TRACE_ID`: AWS request tracing
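
For local runs or tests, you can set these variables before constructing the logger and let auto-detection pick them up; a minimal sketch:

```python
import os

# Set before the logger is constructed so auto-detection picks them up.
os.environ["ENVIRONMENT"] = "stg"
os.environ["AWS_REGION"] = "eu-west-2"

from finomeny_logger import FinomenyLogger

logger = FinomenyLogger(service="my-service", component="api", version="1.0.0")
# logger now reports env="stg", region="eu-west-2" without explicit overrides
```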
### Initialization Options
```python
logger = FinomenyLogger(
    service="my-service",
    component="lambda",
    version="1.0.0",
    env="prod",                 # Override auto-detection
    region="us-east-1",         # Override auto-detection
    auto_detect_aws=True,       # Enable AWS context detection
    validate_schema=True,       # Enable JSON schema validation
    redact_pii=True,            # Enable PII redaction
    max_error_stack_size=8192   # Limit error stack size
)
```
## 📈 AWS Integration
### CloudWatch Logs
Logs automatically flow to CloudWatch when running in Lambda:
```python
# In Lambda, logs go directly to CloudWatch
logger.info("Lambda.Started", "Function execution started")
```
### S3 Data Lake
Configure EventBridge rules to route logs to S3:
```
Pattern: s3://logs/{env}/{service}/dt=YYYY-MM-DD/region={region}/
```
### OpenSearch Indexing
Recommended index pattern:
```
Index: {env}-{service}-yyyy.mm.dd
Partition: (env, service, date(ts))
```
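
Both the S3 prefix and the index name can be derived from fields already present on every record. A small helper showing the derivation (illustrative; not part of the library):

```python
from datetime import datetime

def s3_prefix(record: dict) -> str:
    # "dt" partition uses the ISO 8601 date part of the timestamp.
    day = record["ts"][:10]  # e.g. "2025-09-11"
    return f"s3://logs/{record['env']}/{record['service']}/dt={day}/region={record['region']}/"

def opensearch_index(record: dict) -> str:
    ts = datetime.fromisoformat(record["ts"].replace("Z", "+00:00"))
    return f"{record['env']}-{record['service']}-{ts:%Y.%m.%d}"

record = {"ts": "2025-09-11T09:30:15.123Z", "env": "prod",
          "service": "portfolio-ingester", "region": "eu-west-2"}
print(s3_prefix(record))         # s3://logs/prod/portfolio-ingester/dt=2025-09-11/region=eu-west-2/
print(opensearch_index(record))  # prod-portfolio-ingester-2025.09.11
```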
## 🧪 Testing
```bash
# Run tests
pytest
# Run with coverage
pytest --cov=finomeny_logger
# Run specific test types
pytest tests/unit
pytest tests/integration
```
## 📚 Advanced Usage
### Custom Context Blocks
```python
# Create a custom domain context
def create_payment_context(payment_id, amount, currency):
    return {
        "payment_ctx": {
            "payment_id": payment_id,
            "amount": amount,
            "currency": currency,
            "processor": "stripe"
        }
    }

logger.info(
    "Payment.Processed",
    "Payment completed",
    **create_payment_context("PAY-123", 100.00, "USD")
)
```
### Correlation Across Services
```python
# Service A
logger.info(
    "Request.Started",
    "Processing user request",
    correlation_id="req-abc123"
)

# Service B (use the same correlation_id)
logger.info(
    "Data.Fetched",
    "Retrieved user data",
    correlation_id="req-abc123"  # Same ID for tracing
)
```
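
How the correlation ID travels between services is up to you; a common convention is an HTTP header. The sketch below uses `requests` with a hypothetical `X-Correlation-Id` header and service URL; neither is mandated by the library.

```python
import requests

correlation_id = "req-abc123"

# Service A forwards the ID; Service B reads it from the header and
# passes it to its own logger calls. Header name and URL are placeholders.
requests.post(
    "https://service-b.example.internal/data",
    json={"user_id": "user123"},
    headers={"X-Correlation-Id": correlation_id},
)
```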
### GDPR Compliance
```python
# Classify and protect PII explicitly
logger.info(
    "User.DataProcessed",
    "User personal data processed",
    debtor_id="DEBT-123",     # Automatically hashed
    pii_flags=["tokenized"],  # Explicit PII classification
    kvs={
        "processing_purpose": "debt_collection",
        "legal_basis": "contract",
        "retention_period_days": 2555
    }
)
```
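
The identifier hashing mentioned above pseudonymizes values so the raw ID never reaches the log stream, while hashed IDs stay joinable across log lines. A sketch of one common approach (salted SHA-256); the library's actual scheme is not documented here.

```python
import hashlib

def hash_id(raw_id: str, salt: str = "per-service-secret") -> str:
    # Stable token: the same input always maps to the same digest,
    # so hashed IDs remain correlatable without exposing the raw value.
    return hashlib.sha256(f"{salt}:{raw_id}".encode()).hexdigest()[:16]

print(hash_id("DEBT-123"))
```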
## 🏗️ Development
### Setup Development Environment
```bash
git clone https://github.com/FinomenyTech/finomeny-logger.git
cd finomeny-logger
pip install -e ".[dev]"
pre-commit install
```
### Code Quality
```bash
# Format code
black src tests
# Sort imports
isort src tests
# Lint
flake8 src tests
# Type check
mypy src
```
## 📄 License
MIT License - see [LICENSE](LICENSE) file for details.
## 🤝 Contributing
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## 📞 Support
- 📧 Email: nesi@finomeny.es
- 🐛 Issues: [GitHub Issues](https://github.com/FinomenyTech/finomeny-logger/issues)
- 📚 Documentation: [Read the Docs](https://finomeny-logger.readthedocs.io)
## 🚀 Roadmap
- [ ] Elasticsearch/OpenSearch direct output
- [ ] Metrics collection integration (Prometheus)
- [ ] Custom PII detection patterns
- [ ] Log sampling for high-volume services
- [ ] Real-time log streaming
- [ ] Integration with AWS X-Ray
---
**Made with ❤️ by the Finomeny Engineering Team**