Name: mentorstec
Version: 1.0.2
Summary: Mentorstec - centralized event logging and monitoring
Upload time: 2025-08-12 00:53:08
Requires Python: >=3.8
Keywords: azure, event-hub, exceptions, logging, mentorstec, monitoring
# Mentorstec

![Pipeline](https://github.com/Mentorstec/mentorstec/actions/workflows/pipeline.yml/badge.svg)
![PyPI Version](https://img.shields.io/pypi/v/mentorstec?color=blue&logo=pypi&logoColor=white)
![Python Versions](https://img.shields.io/pypi/pyversions/mentorstec?logo=python&logoColor=white)
![License](https://img.shields.io/github/license/Mentorstec/mentorstec?color=green)
![Downloads](https://img.shields.io/pypi/dm/mentorstec?color=orange&logo=pypi)

Centralized data and logging platform built on the Repository Pattern. Supports Azure Service Bus, Dremio, and Power BI with a modular, extensible architecture.

## 🚀 Installation

### Basic Installation
```bash
pip install mentorstec
```

### With Optional Modules
```bash
# Power BI support
pip install mentorstec[powerbi]

# Dremio support
pip install mentorstec[dremio]

# All features
pip install mentorstec[powerbi,dremio]

# Development dependencies
pip install mentorstec[dev]
```

### Local Development
```bash
git clone https://github.com/mentorstec/mentorstec.git
cd mentorstec
pip install -e ".[dev,powerbi,dremio]"
```

## 🏗️ Architecture

Mentorstec implements the **Repository Pattern** for maximum flexibility and extensibility:

```
mentorstec/
├── eventhub/                     # 🔄 Azure Service Bus integration
│   ├── event_hub.py              #   - Global convenience functions
│   └── event_hub_client.py       #   - Main event client
│
├── dremio/                       # 🗄️ Data virtualization (optional)
│   └── dremio.py                 #   - SQL query client
│
├── powerbi/                      # 📊 Power BI integration (optional)
│   └── powerbi.py                #   - Dataset and dataflow refreshes
│
├── repository/                   # 🏛️ Abstract interfaces
│   ├── eventhub_repository.py    #   - Event logging interface
│   ├── dremio_repository.py      #   - Data query interface
│   └── powerbi_repository.py     #   - Power BI operations interface
│
├── azure/                        # ☁️ Azure implementations
│   └── azure_service_bus_repository.py  # - Concrete Service Bus implementation
│
└── lakehouse/                    # 🏠 Reserved for future use
    └── __init__.py               #   - Placeholder for expansion
```

### 🎯 Design Principles

- **Repository Pattern**: abstract interfaces with concrete implementations
- **Optional Dependencies**: modules are loaded only when their dependencies are installed
- **Backward Compatibility**: legacy APIs are kept working
- **Extensibility**: new modules and providers are easy to add
- **Production Ready**: automated tests, CI/CD, and releases
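
The optional-dependency principle above can be implemented with a guarded import. The helper below is an illustrative sketch, not the package's actual loader:

```python
import importlib

def optional_import(module_name):
    """Import a module if it is installed; return None otherwise.

    Mirrors the "Optional Dependencies" principle: a feature module is
    only activated when its dependency can actually be imported.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError:
        return None

# json is in the stdlib, so this succeeds; a missing extra yields None
assert optional_import("json") is not None
assert optional_import("no_such_module_xyz123") is None
```

A package `__init__` can use the same pattern to expose `Dremio` or `PowerBi` only when the `dremio` or `powerbi` extras are installed.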

## 📋 Configuration

### Azure Service Bus (Core)
```bash
export AZURE_SERVICE_BUS_CONNECTION_STRING="Endpoint=sb://..."
```
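
The client presumably reads this variable from the environment at runtime. A minimal fail-fast check looks like this (`get_service_bus_connection` is an illustrative helper, not part of the package API):

```python
import os

def get_service_bus_connection() -> str:
    """Read the Service Bus connection string, failing fast if unset.

    Hypothetical helper: raising early with a clear message avoids
    confusing downstream errors when the variable was never exported.
    """
    conn = os.environ.get("AZURE_SERVICE_BUS_CONNECTION_STRING")
    if not conn:
        raise RuntimeError(
            "AZURE_SERVICE_BUS_CONNECTION_STRING is not set; "
            "export it before using the EventHub client."
        )
    return conn
```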

### Power BI (Optional)
Configure via Airflow variables or manually:
```python
project_config = [{
    'api': {
        'authority': 'https://login.microsoftonline.com/your-tenant-id',
        'resource': 'https://analysis.windows.net/powerbi/api',
        'powerbi': 'https://api.powerbi.com/v1.0/myorg'
    },
    'credentials': {
        'client': 'your-client-id',
        'usr': 'service-account@company.com',
        'pass': 'service-account-password'
    }
}]
```

### Dremio (Optional)
```python
dremio_config = {
    'host': 'your-dremio-host',
    'port': 9047,
    'username': 'your-username',
    'password': 'your-password'
}
```

## 🎯 Usage Examples

### 🔄 EventHub - Azure Service Bus

#### Direct Client (Recommended)
```python
from mentorstec import EventHubClient

# Create a client - queue_name is now required
client = EventHubClient.create_azure_client("meu-projeto", "events-queue", layer="web")

# Send an event
client.send_event(
    event_type="USER_LOGIN",
    message="User logged in successfully",
    object="auth_service",
    tags=["auth", "success", "production"],
    user_id=12345,
    session_id="sess_abc123",
    ip_address="192.168.1.100"
)

# Capture errors automatically
@client.capture_errors("payment_process")
def process_payment(amount, currency="USD"):
    if amount <= 0:
        raise ValueError("Invalid payment amount")
    if currency not in ["USD", "BRL", "EUR"]:
        raise ValueError("Unsupported currency")
    return {"status": "success", "amount": amount}

# Send a contextualized error
try:
    process_payment(-100)
except Exception as e:
    client.send_error(
        e, 
        context="payment_validation",
        user_id=12345,
        order_id="ord_789",
        additional_data={"attempted_amount": -100}
    )
```
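
The `capture_errors` decorator used above can be approximated as follows. This is a hypothetical re-implementation for illustration only, with `send_error` injectable so the sketch runs without a Service Bus connection:

```python
import functools

def capture_errors(context, send_error=print):
    """Illustrative sketch of a capture_errors-style decorator.

    `send_error` stands in for client.send_error; the real client would
    forward the exception to Azure Service Bus instead of printing it.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception as exc:
                send_error(f"[{context}] {type(exc).__name__}: {exc}")
                raise  # re-raise so callers still see the failure
        return wrapper
    return decorator

@capture_errors("payment_process")
def process_payment(amount):
    if amount <= 0:
        raise ValueError("Invalid payment amount")
    return {"status": "success", "amount": amount}
```

Re-raising after reporting is the key design choice: the event hub gets the error context, but the caller's control flow is unchanged.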

#### Global Exception Handler
```python
import sys
from mentorstec import EventHubClient

client = EventHubClient.create_azure_client("production-app", "critical-events", "global")

def global_exception_handler(exc_type, exc_value, exc_traceback):
    import traceback
    client.send_event(
        event_type="CRITICAL_ERROR",
        message=str(exc_value),
        obs="".join(traceback.format_exception(exc_type, exc_value, exc_traceback)),
        object="uncaught_exception",
        tags=["critical", "uncaught", exc_type.__name__],
        severity="HIGH"
    )
    sys.__excepthook__(exc_type, exc_value, exc_traceback)

sys.excepthook = global_exception_handler
```

### 🗄️ Dremio - Data Virtualization

```python
from mentorstec import Dremio

# Initialize the Dremio client
dremio = Dremio(
    host="dremio.company.com",
    port=9047,
    username="data_analyst",
    password="secure_password"
)

# Authenticate
if dremio.authenticate():
    print("✅ Connected to Dremio successfully")

    # Generic SQL query - the core method
    sales_data = dremio.execute_sql("""
        SELECT 
            region,
            product_category,
            SUM(revenue) as total_revenue,
            COUNT(orders) as total_orders,
            AVG(order_value) as avg_order_value
        FROM sales.fact_orders 
        WHERE order_date >= '2024-01-01'
        GROUP BY region, product_category
        ORDER BY total_revenue DESC
        LIMIT 100
    """)
    
    if sales_data:
        print(f"📊 Found {len(sales_data.get('rows', []))} rows")

    # List tables under a path
    tables = dremio.get_tables("sales")
    print(f"📁 Available tables: {tables}")

    # Fetch table metadata
    metadata = dremio.get_table_metadata("sales.fact_orders")
    if metadata:
        print(f"🔍 Columns: {[col['name'] for col in metadata.get('columns', [])]}")

    # Advanced analytical query: customer segmentation with a CTE
    customer_analysis = dremio.execute_sql("""
        WITH customer_metrics AS (
            SELECT 
                customer_id,
                COUNT(DISTINCT order_id) as total_orders,
                SUM(order_value) as lifetime_value,
                AVG(order_value) as avg_order_value,
                MAX(order_date) as last_order_date
            FROM sales.fact_orders 
            WHERE customer_id IS NOT NULL
            GROUP BY customer_id
        )
        SELECT 
            CASE 
                WHEN lifetime_value > 10000 THEN 'VIP'
                WHEN lifetime_value > 5000 THEN 'Premium'
                WHEN lifetime_value > 1000 THEN 'Regular'
                ELSE 'New'
            END as customer_tier,
            COUNT(*) as customer_count,
            AVG(lifetime_value) as avg_lifetime_value,
            AVG(total_orders) as avg_orders_per_customer
        FROM customer_metrics
        GROUP BY 1
        ORDER BY avg_lifetime_value DESC
    """)
```
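
`authenticate()` and `execute_sql()` plausibly wrap Dremio's public REST API: a login request that yields a token, followed by SQL submission as an asynchronous job. The sketch below builds the documented request shapes; the endpoint paths and `_dremio` token prefix come from Dremio's REST documentation, while the helper functions themselves are hypothetical and may not match the package's internals:

```python
def login_request(host: str, port: int, username: str, password: str) -> dict:
    """Build the Dremio login request (POST /apiv2/login)."""
    return {
        "url": f"http://{host}:{port}/apiv2/login",
        "json": {"userName": username, "password": password},
    }

def sql_request(host: str, port: int, token: str, sql: str) -> dict:
    """Build the SQL submission request (POST /api/v3/sql).

    Dremio runs queries as asynchronous jobs: the response carries a job
    id that is then polled via GET /api/v3/job/{id} until COMPLETED, and
    rows are fetched from GET /api/v3/job/{id}/results.
    """
    return {
        "url": f"http://{host}:{port}/api/v3/sql",
        "headers": {"Authorization": f"_dremio{token}"},
        "json": {"sql": sql},
    }
```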

### 📊 Power BI - Datasets and Dataflows

```python
from mentorstec import PowerBi

# Initialize from an Airflow variable
powerbi = PowerBi("powerbi_production_config")

# Or configure manually
project_config = [{
    'api': {
        'authority': 'https://login.microsoftonline.com/your-tenant-id',
        'resource': 'https://analysis.windows.net/powerbi/api',
        'powerbi': 'https://api.powerbi.com/v1.0/myorg'
    },
    'credentials': {
        'client': 'your-application-client-id',
        'usr': 'powerbi-service@company.com',
        'pass': 'service-account-password'
    },
    'datasets': [{
        'group_id': 'f089354e-8366-4e18-aea3-4cb4a3a50b48',
        'datasets': [
            {'tag': 'sales_dashboard', 'id': 'cfafbeb1-8037-4d0c-896e-a46fb27ff229'},
            {'tag': 'financial_report', 'id': '3d9b93c6-7b6d-4801-a491-1738910bd1b0'},
            {'tag': 'customer_analytics', 'id': 'a1b2c3d4-e5f6-7890-ghij-klmnopqrstuv'}
        ]
    }],
    'dataflows': [{
        'group_id': 'f089354e-8366-4e18-aea3-4cb4a3a50b48',
        'dataflows': [
            {'tag': 'customer_etl', 'id': '1234abcd-ef56-7890-ghij-klmnopqrstuv'},
            {'tag': 'sales_etl', 'id': 'abcd1234-5678-90ef-ghij-klmnopqrstuv'}
        ]
    }]
}]

# Refresh a specific dataset
success = powerbi.refresh_dataset(
    tag='sales_dashboard',
    project_config=project_config,
    payload_json={
        "type": "full",
        "commitMode": "transactional",
        "applyRefreshPolicy": "false"
    }
)

if success:
    print("✅ Sales dashboard refreshed successfully")

# Refresh a dataflow
etl_success = powerbi.refresh_dataflow(
    tag='customer_etl',
    project_config=project_config
)

# Batch refresh of all datasets
results = powerbi.refresh_all_datasets(project_config)
print("📊 Batch refresh results:")
for dataset_tag, result in results.items():
    status = "✅ Success" if result else "❌ Failed"
    print(f"  {dataset_tag}: {status}")

# Full operation with manual control
token = powerbi.get_token(project_config)
if token:
    urls = powerbi.generate_url('datasets', project_config)

    for tag, url in urls.items():
        print(f"🔄 Processing {tag}...")

        # Run the refresh and wait for completion
        success = powerbi.execute(
            url=url,
            token=token,
            payload_json={
                "type": "full", 
                "commitMode": "transactional",
                "applyRefreshPolicy": "false"
            }
        )
        
        if success:
            print(f"  ✅ {tag} done")
        else:
            print(f"  ❌ {tag} failed")
```
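
`refresh_dataset()` and `generate_url()` presumably target the public Power BI REST API, whose dataset-refresh route is `POST {base}/groups/{groupId}/datasets/{datasetId}/refreshes`. A minimal sketch of that documented URL shape, with a hypothetical helper name:

```python
def refresh_url(base: str, group_id: str, dataset_id: str) -> str:
    """Build the documented Power BI dataset-refresh endpoint.

    Hypothetical helper: generate_url() plausibly produces URLs of this
    shape from the group_id/datasets entries in project_config. The
    payload_json ("type", "commitMode", "applyRefreshPolicy") is sent as
    the POST body of this request.
    """
    return f"{base}/groups/{group_id}/datasets/{dataset_id}/refreshes"
```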

## 🎯 Combined Integration

### Complete Data Pipeline
```python
from mentorstec import EventHubClient, Dremio, PowerBi
import logging

# Configure logging
logging.basicConfig(level=logging.INFO)

# Initialize clients
event_client = EventHubClient.create_azure_client("data-pipeline", "pipeline-events", "etl")
dremio = Dremio(host="dremio.company.com", port=9047, username="etl_user", password="password")
powerbi = PowerBi("powerbi_etl_config")

def run_daily_pipeline():
    """Run the complete daily data pipeline."""

    try:
        # 1. Log pipeline start
        event_client.send_event(
            event_type="PIPELINE_START",
            message="Starting daily data pipeline",
            object="daily_etl",
            tags=["pipeline", "etl", "daily"]
        )
        
        # 2. Extract data from Dremio
        if dremio.authenticate():
            daily_metrics = dremio.execute_sql("""
                SELECT 
                    DATE(order_date) as date,
                    SUM(revenue) as daily_revenue,
                    COUNT(*) as daily_orders,
                    COUNT(DISTINCT customer_id) as unique_customers
                FROM sales.fact_orders 
                WHERE order_date >= CURRENT_DATE - INTERVAL '1' DAY
                GROUP BY 1
            """)
            
            if daily_metrics and daily_metrics.get('rows'):
                event_client.send_event(
                    event_type="DATA_EXTRACTED",
                    message=f"Extracted {len(daily_metrics['rows'])} rows from Dremio",
                    object="dremio_extraction",
                    tags=["success", "extraction"]
                )
            else:
                raise Exception("No data found in Dremio")
        else:
            raise Exception("Dremio authentication failed")
        
        # 3. Refresh Power BI
        projects = powerbi.get_power_bi_projects()
        refresh_results = powerbi.refresh_all_datasets(projects)
        
        successful_refreshes = sum(1 for success in refresh_results.values() if success)
        total_datasets = len(refresh_results)
        
        if successful_refreshes == total_datasets:
            event_client.send_event(
                event_type="POWERBI_SUCCESS",
                message=f"All {total_datasets} datasets refreshed successfully",
                object="powerbi_refresh",
                tags=["success", "powerbi", "complete"]
            )
        else:
            event_client.send_event(
                event_type="POWERBI_PARTIAL",
                message=f"{successful_refreshes}/{total_datasets} datasets refreshed",
                object="powerbi_refresh",
                tags=["warning", "powerbi", "partial"],
                failed_datasets=[tag for tag, success in refresh_results.items() if not success]
            )
        
        # 4. Log pipeline success
        event_client.send_event(
            event_type="PIPELINE_SUCCESS",
            message="Daily pipeline completed successfully",
            object="daily_etl",
            tags=["pipeline", "success", "complete"],
            datasets_processed=total_datasets,
            records_extracted=len(daily_metrics.get('rows', []))
        )
        
    except Exception as e:
        # Log a detailed error
        event_client.send_error(
            e,
            context="daily_pipeline",
            tags=["pipeline", "failure", "critical"],
            pipeline_stage="data_extraction"
        )
        raise

# Run the pipeline
if __name__ == "__main__":
    run_daily_pipeline()
```

## 📊 Payload Structure

### EventHub - Default Format
```json
{
    "project": "meu-projeto",
    "layer": "web",
    "queue_name": "events-queue",
    "message": "User logged in",
    "obs": "",
    "timestamp": "2025-01-11T10:30:45.123456Z",
    "event_type": "USER_LOGIN", 
    "object": "auth_service",
    "tags": ["auth", "success", "production"],
    "user_id": 12345,
    "session_id": "sess_abc123"
}
```
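
A payload of this shape can be assembled as below. `build_event_payload` is a hypothetical helper for illustration; the client constructs an equivalent dict internally from `send_event`'s arguments:

```python
from datetime import datetime, timezone

def build_event_payload(project, layer, queue_name, message, event_type,
                        object, tags, **extra):
    """Assemble a payload matching the default format shown above.

    Extra keyword arguments (user_id, session_id, ...) are merged in as
    top-level fields, as in the example payload. The `object` parameter
    name shadows the builtin to match the documented send_event API.
    """
    payload = {
        "project": project,
        "layer": layer,
        "queue_name": queue_name,
        "message": message,
        "obs": "",
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ"),
        "event_type": event_type,
        "object": object,
        "tags": tags,
    }
    payload.update(extra)
    return payload
```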

## 🧪 Tests and Quality

```bash
# Run the full test suite
./run_tests.sh

# Unit tests only
pytest tests/ -v --cov=mentorstec

# Formatting and linting
black mentorstec/
ruff check mentorstec/
mypy mentorstec/ --ignore-missing-imports
```

## 🚀 Deployment and CI/CD

### Automatic Deployment
The project ships with a complete CI/CD pipeline:

- **Push/PR to `main`**: runs tests and deploys automatically
- **TestPyPI**: automatic deployment on PRs
- **PyPI**: automatic deployment on merge to main

### Secrets Configuration
Configure these tokens in GitHub:
- `PYPI_TOKEN`: token for the official PyPI
- `PYPI_TEST_TOKEN`: token for TestPyPI

### Versioning
Bump the version in `mentorstec/__init__.py`:
```python
__version__ = "0.1.4"  # Next version
```

## 🔧 Extensibility

### Adding New Modules
```python
# Example: a new Snowflake module
from mentorstec import EventHubClient

class SnowflakeClient:
    def __init__(self, connection_params):
        self.event_client = EventHubClient.create_azure_client(
            "snowflake-ops", "snowflake-events", "data"
        )
        # Your implementation here

    def execute_query(self, query):
        try:
            # Query execution logic
            result = self._execute(query)
            
            self.event_client.send_event(
                event_type="QUERY_SUCCESS",
                message="Query executed successfully",
                object="snowflake_query",
                tags=["snowflake", "success"],
                query_hash=hash(query),
                row_count=len(result)
            )
            return result
            
        except Exception as e:
            self.event_client.send_error(
                e, context="snowflake_query",
                query=query[:100]  # First 100 chars
            )
            raise
```

## 📚 Full Documentation

- **Advanced Examples**: `/examples/powerbi_usage_examples.py`
- **Deployment Guide**: `/docs/DEPLOY_GUIDE.md`
- **Testing Guide**: `/docs/TESTING_GUIDE.md`
- **Secrets Configuration**: `/.github/SECRETS.md`

## 📄 License

MIT License - see [LICENSE](LICENSE) for details.

## 🤝 Contributing

1. Fork the project
2. Create your feature branch (`git checkout -b feature/new-feature`)
3. Run the tests (`./run_tests.sh`)
4. Commit your changes (`git commit -m 'Add new feature'`)
5. Push the branch (`git push origin feature/new-feature`)
6. Open a Pull Request

## 📞 Support

- **Email**: diego@mentorstec.com.br
- **Issues**: [GitHub Issues](https://github.com/mentorstec/mentorstec/issues)
- **Documentation**: [GitHub Wiki](https://github.com/mentorstec/mentorstec/wiki)

---

**Mentorstec v0.1.3** - Complete data and logging platform with a modular architecture 🚀
Push para a branch (`git push origin feature/nova-feature`)\n6. Abra um Pull Request\n\n## \ud83d\udcde Suporte\n\n- **Email**: diego@mentorstec.com.br\n- **Issues**: [GitHub Issues](https://github.com/mentorstec/mentorstec/issues)\n- **Documenta\u00e7\u00e3o**: [GitHub Wiki](https://github.com/mentorstec/mentorstec/wiki)\n\n---\n\n**Mentorstec v0.1.3** - Plataforma completa de dados e logging com arquitetura modular \ud83d\ude80",
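For reference, the standard payload documented in the Payload Structure section can be assembled with a few lines of plain Python. This is a hypothetical sketch for illustration only — `build_payload` is not part of the library's API; the real client builds this internally from the arguments passed to `send_event`:

```python
from datetime import datetime, timezone

def build_payload(project, layer, queue_name, message, event_type, obj, tags, **extra):
    """Hypothetical helper: assemble an event in the documented standard format."""
    payload = {
        "project": project,
        "layer": layer,
        "queue_name": queue_name,
        "message": message,
        "obs": "",
        # UTC timestamp with a trailing "Z", as in the documented example
        "timestamp": datetime.now(timezone.utc).isoformat().replace("+00:00", "Z"),
        "event_type": event_type,
        "object": obj,
        "tags": tags,
    }
    payload.update(extra)  # custom fields (user_id, session_id, ...) are merged in
    return payload

# Mirrors the sample payload from the Payload Structure section
event = build_payload(
    "meu-projeto", "web", "events-queue", "User logged in",
    "USER_LOGIN", "auth_service", ["auth", "success", "production"],
    user_id=12345, session_id="sess_abc123",
)
```

This mirrors how arbitrary keyword arguments passed to `send_event` end up as top-level keys alongside the fixed fields.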
    "bugtrack_url": null,
    "license": null,
    "summary": "Mentorstec - centralized event logging and monitoring",
    "version": "1.0.2",
    "project_urls": {
        "Documentation": "https://github.com/Mentorstec/mentorstec#readme",
        "Homepage": "https://github.com/Mentorstec/mentorstec",
        "Issues": "https://github.com/Mentorstec/mentorstec/issues",
        "Repository": "https://github.com/Mentorstec/mentorstec"
    },
    "split_keywords": [
        "azure",
        " event-hub",
        " exceptions",
        " logging",
        " mentorstec",
        " monitoring"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "5a9e599bca98863396c6768f9b9ec8d496d8df68a7992950f0e98fc6dad1e436",
                "md5": "86de334e9f9b0174a7f167b2739657b9",
                "sha256": "55b26581f5a685d3e681dc665f81d5e3de917e1b9a1705fa3a0ec69c0ab454d0"
            },
            "downloads": -1,
            "filename": "mentorstec-1.0.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "86de334e9f9b0174a7f167b2739657b9",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 23896,
            "upload_time": "2025-08-12T00:53:06",
            "upload_time_iso_8601": "2025-08-12T00:53:06.967561Z",
            "url": "https://files.pythonhosted.org/packages/5a/9e/599bca98863396c6768f9b9ec8d496d8df68a7992950f0e98fc6dad1e436/mentorstec-1.0.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "891a9a6e8903ab7be458a904acf83c427f562e2b39197ea7f4be65088bfaa1af",
                "md5": "1fe87b518602a3027cd3d84cf7478572",
                "sha256": "dda570522e61ef3350a1c4075bb68f02bed86cb0e9ea16cbf75ab4c4569f89f4"
            },
            "downloads": -1,
            "filename": "mentorstec-1.0.2.tar.gz",
            "has_sig": false,
            "md5_digest": "1fe87b518602a3027cd3d84cf7478572",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 42150,
            "upload_time": "2025-08-12T00:53:08",
            "upload_time_iso_8601": "2025-08-12T00:53:08.174033Z",
            "url": "https://files.pythonhosted.org/packages/89/1a/9a6e8903ab7be458a904acf83c427f562e2b39197ea7f4be65088bfaa1af/mentorstec-1.0.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-12 00:53:08",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "Mentorstec",
    "github_project": "mentorstec#readme",
    "github_not_found": true,
    "lcname": "mentorstec"
}
        