aic-utils

- **Name**: aic-utils
- **Version**: 2.0.0
- **Home page**: https://github.com/dylandoyle11/aic_utils
- **Summary**: AIC API wrapper and GitLab integration framework for pipeline management
- **Upload time**: 2025-07-30 18:31:13
- **Author**: Dylan D
- **Maintainer**: None
- **Docs URL**: None
- **Requires Python**: >=3.8
- **License**: None
- **Keywords**: aic, gitlab, pipeline, automation, devops
- **Requirements**: No requirements were recorded.
# GitLabManager & AIC Integration Guide


## 1. Executive Overview

This guide introduces a repeatable, audit-friendly methodology for managing your AIC pipelines in tandem with GitLab, bridging the gap between your workspace configurations in AIC and your version-controlled artifacts in Git.

By leveraging the aic_utils package and the GitLabManager helper library, you gain end-to-end control over:
   - Authentication & discovery
   - Configuration extraction & storage
   - Automated branch and workspace mapping
   - Safe, MR-driven promotions through Sandbox → QA → Production
   - Embedded code snippet extraction for SQL and PySpark tasks

This integration ensures that every pipeline change—from minor SQL tweak to full-scale workflow overhaul—is captured as code, reviewed via merge requests, and traceable through Git history. It enforces a strict 1:1 relationship between Git branches and AIC workspaces, preventing accidental or out-of-sync deployments. Whether you’re onboarding a new environment, back-syncing an existing workspace, or building feature-specific sandbox branches, this approach provides the guardrails and automation you need to move fast without sacrificing governance or reliability.

In the sections that follow, you’ll find:

   - aic_utils Overview – Core classes, methods, and patterns for programmatic pipeline management.
   - GitLabManager Deep-Dive – Configuration, branch mapping, and code-sync mechanics.
   - Day-to-Day Workflows – Step-by-step guides for common tasks: syncing changes, feature work, releases, and hotfixes.
   - Advanced Scenarios – Bootstrapping new workspaces, force-sync operations, and backup strategies.
   - Best Practices & Troubleshooting – Tips to avoid pitfalls and quickly resolve common issues.



### Core GitLabManager functionality:

- **Storing every pipeline** as JSON and code snippets in Git.
   - Job metadata is stored in `config` in JSON form.
   - Code snippets are saved in the respective `code` folders for corporate code scanning.
- **Enforcing** clear branch naming conventions and 1:1 branch ↔ workspace mappings.
- **Automating** creation of feature, release, and hotfix branches, and enforcing proper git flow.
- **Preventing** inadvertent cross-environment deployments, disruption to production, or loss of code between environments.
- **Auditing** all changes via merge requests and commit history.

---

## 2. AIC SDK (aic_utils)

The `AIC` class wraps the AIC REST API, exposing five main capability areas:

1. **Authentication & Init**  
2. **Pipeline Discovery**  
3. **Configuration Fetching**  
4. **Pipeline Upsert (CRUD)**  
5. **Workspace Artifact Management**

---

### 2.1 Authentication & Init

AIC(api_key: str,
    project: str,
    workspace: str,
    pipelines: list[str] = [],
    qa: bool = False)

- **What it does**  
  - Chooses the PROD vs QA base-URL (qa flag).  
  - Sets headers (accept, api-key, Content-Type).  
  - Resolves and stores project_id and workspace_id.  
  - Loads all pipelines into .pipelines and, if requested, their configs into .pipeline_configs.

- **Attributes**  
  - self.base_url – endpoint root  
  - self.headers – auth + content headers  
  - self.project_id, self.workspace_id (strings)  
  - self.pipelines: list[dict{name: str, id: str}]  
  - self.pipeline_configs: list[dict{name: str, jobConfig: dict, id: str}]
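
For orientation, a minimal instantiation might look like the following sketch (the import path, key, project, and workspace values are placeholders; see section 4.2 for a concrete setup):

```python
from aic_utils import AIC  # import path assumed for this sketch

# Placeholder credentials and names, shown only to illustrate the signature above.
aic = AIC(
    api_key   = "<API-KEY>",
    project   = "<PROJECT-EMAIL>",
    workspace = "<WORKSPACE NAME>",
    pipelines = ["example_pipeline"],   # or ['*'] to load every pipeline
    qa        = True                    # use the QA base URL instead of PROD
)

print(aic.workspace_id)                 # resolved workspace ID
print([p["name"] for p in aic.pipelines])
```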

---

### 2.2 Pipeline Discovery

.pop_pipelines() → list[{'name': str, 'id': str}]

- **Signature**  
  def pop_pipelines(self) -> list[{'name': str, 'id': str}]
- **What it does**  
  - GET /projects/{proj}/workspaces/{ws}/jobs  
  - Parses jobs array, extracts title as name and '$id' as id.
- **Returns**  
  - A list of {'name': pipelineName, 'id': pipelineId}.
- **Side effects**  
  - Updates self.pipelines and prints “Loaded N pipelines…”.

.pop_pipeline_config(pipelines: list[str]) → list[{'name', 'jobConfig', 'id'}]

- **Signature**  
  def pop_pipeline_config(self, pipelines: list[str]) -> list[{'name', 'jobConfig', 'id'}]
- **Parameters**  
  - pipelines: names to fetch, or ['*'] for all.
- **What it does**  
  - Filters self.pipelines by name.  
  - Uses a thread pool to call fetch_pipeline_config() in parallel.
- **Returns**  
  - A list of configs: { 'name':…, 'jobConfig': <JSON>, 'id': <jobConfig.$id> }.
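
A hedged usage sketch, assuming an `aic` instance like the one shown in 2.1:

```python
# Refresh the discovered pipelines for the workspace.
pipelines = aic.pop_pipelines()              # [{'name': ..., 'id': ...}, ...]

# Fetch configs for a subset by name, or pass ['*'] to fetch all of them.
configs = aic.pop_pipeline_config(['example_pipeline'])
for cfg in configs:
    print(cfg['name'], cfg['id'])
```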

---

### 2.3 Configuration Fetching

.fetch_pipeline_config(pipeline: dict, direct: bool=False) → dict

- **Signature**  
  def fetch_pipeline_config(self, pipeline: {'name': str, 'id' or '$id': str}, direct: bool = False) → {'name': str, 'jobConfig': dict, 'id': str}
- **Parameters**  
  - pipeline: one entry from self.pipelines.  
  - direct: if True, uses pipeline['$id'] directly.
- **What it does**  
  - GET /projects/{proj}/workspaces/{ws}/jobs/{id}  
  - Extracts jobConfig block and its own '$id'.
- **Returns**  
  - { 'name': pipelineName, 'jobConfig': <JSON>, 'id': <jobConfig.$id> }
- **Raises**  
  - KeyError if no matching pipeline is found.
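
For example (a sketch; entries come from `self.pipelines` as described above):

```python
# Fetch the full jobConfig for the first discovered pipeline.
entry  = aic.pipelines[0]                    # {'name': ..., 'id': ...}
config = aic.fetch_pipeline_config(entry)
print(config['name'], config['id'])
```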

---

### 2.4 Pipeline Upsert (Create/Update)

.write_config_to_pipeline(config: dict) → None

- **Signature**  
  def write_config_to_pipeline(self, config: {'name': str, 'jobConfig': dict, 'id'?: str}) → None
- **What it does**  
  - Builds payload with title, stages, variables, sourceStage, sinkStage.  
  - POST /projects/{proj}/workspaces/{ws}/jobs (creates or updates based on id).
- **Logs**  
  - “Created pipeline…” or “Updated pipeline…”
- **Error Handling**  
  - Catches HTTP errors, prints status and response text.

.create_pipeline_branch(config: dict) → Response

- **Signature**  
  def create_pipeline_branch(self, config: {'id': str, 'jobConfig': dict, 'name': str}) → Response
- **What it does**  
  - Generates PUSH_YYYYMMDDHHMMSS branch.  
  - PUT /interactive-pipeline/{jobId}/branches.

.create_or_update_pipeline(workspace_id: str, pipeline_config: dict) → None

- **Signature**  
  def create_or_update_pipeline(self, workspace_id: str, pipeline_config: {'name': str, 'jobConfig': dict, 'id'?: str}) → None
- **What it does**  
  - POST /jobs to create. On 409 Conflict, calls update_pipeline().

.update_pipeline(workspace_id: str, pipeline_id: str, pipeline_config: dict) → None

- **Signature**  
  def update_pipeline(self, workspace_id: str, pipeline_id: str, pipeline_config: {'name': str, 'jobConfig': dict}) → None
- **What it does**  
  - PUT /projects/{proj}/workspaces/{ws}/jobs/{id} to update.
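
A sketch of a fetch-and-upsert round trip, assuming the config dictionaries have the shape described above (the workspace ID below is a placeholder):

```python
# Fetch an existing config and write it back; the presence of 'id' decides
# whether this is treated as an update rather than a create.
config = aic.fetch_pipeline_config(aic.pipelines[0])
aic.write_config_to_pipeline(config)

# Alternatively, target an explicit workspace.
aic.create_or_update_pipeline("<WORKSPACE-ID>", config)
```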

---

### 2.5 Workspace Artifact Management

**Datasets & Tables**  
.get_datasets() → list[dict]  
  - GET /datasets → list of dataset objects.  
.get_tables(dataset_id: str) → list[dict]  
  - GET /datasets/{id}/tables → list of tables.

**Interactive Branches**  
.delete_branches(job_names: list[str]) → None  
  - Deletes interactive branches via GET + DELETE.

**Backups**  
.backup_pipelines(pipelines: list[Union[str, dict]], base_folder: str='.', drive_name: str='backups') → None  
  - Creates dated folder (YYYYMMDD); uploads JSON via POST /upload-file.  
.get_drive_id_by_name(drive_name: str) → Optional[str]
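
Usage might look like this sketch (drive and pipeline names are placeholders):

```python
# Back up selected pipelines into a dated folder on the named drive.
aic.backup_pipelines(
    pipelines  = ['example_pipeline'],   # names or config dicts
    drive_name = 'backups'
)

drive_id = aic.get_drive_id_by_name('backups')
print(drive_id)   # None if no drive with that name exists
```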

---

#### Under the Hood

- Concurrency: Uses ThreadPoolExecutor.  
- Error Handling: response.raise_for_status(), prints warnings.  
- Logging: print() statements for auditability.  
- Timestamp: AIC.timestamp = instantiation date (YYYYMMDD).  
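
For illustration, the parallel config fetching mentioned above follows this general pattern (a sketch, not the library's actual internals):

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_all_configs(aic, pipelines):
    """Fetch pipeline configs in parallel, mirroring the ThreadPoolExecutor pattern."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(aic.fetch_pipeline_config, pipelines))
```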

---

## 3. GitLabManager Overview

The `GitLabManager` class uses your AIC instance and the GitLab API to synchronize pipeline definitions, enforce branch/workspace governance, and automate promotions.

---

### 3.1 Initialization & Setup

GitLabManager(
    aic_instance: AIC,
    gitlab_token: str,
    gitlab_namespace: str,
    repo_folder: str,
    gitlab_base_url: str = "https://git.autodatacorp.org/api/v4",
    use_hash_comparison: bool = True,
    email_recipients: list = None,
    email_sender: str = None,
    mapping_file: str = "branch_to_workspace.yaml"
)

- **What it does**  
  - Stores references to AIC, GitLab endpoints, and auth tokens.  
  - Resolves `project_path`, retrieves `default_branch`, and loads branch→workspace mapping.  
  - Determines current branch (`self.branch`) from AIC workspace and ensures it exists.  
  - Configures optional email summary settings.

- **Key attributes**  
  - `self.aic`, `self.gitlab_base`, `self.gitlab_token`, `self.headers`  
  - `self.project_path`, `self.default_branch`, `self.branch`, `self.mapper`  
  - `self.email_recipients`, `self.email_sender`, `self.use_hash_comparison`

---

### 3.2 Repository & Branch Utilities

.repository_exists(repo_path: str) → bool  
  - **Checks**: `GET /projects/{repo_path}`  
  - **Returns**: True if HTTP 200, else False.

.create_repository(repo_name: str) → dict or None  
  - **Creates**: `POST /projects` under specified subgroup.  
  - **Returns**: Project JSON on success, else None.

.get_subgroup_id() → int  
  - **Fetches**: `GET /groups/{namespace}` to find subgroup ID.

.ensure_branch_exists(branch_name: str, ref: str) → None  
  - **Checks**: `GET /repository/branches/{branch_name}`  
  - **Creates**: If 404, `POST /repository/branches?branch={branch_name}&ref={ref}`  
  - **Logs**: Prints status of creation or existence.

._slugify(text: str) → str  
  - **Converts**: Arbitrary text to a URL-safe slug.

.check_token_expiration(warning_threshold_days: int = 30) → None  
  - **Retrieves**: Personal access tokens via `GET /personal_access_tokens`  
  - **Calculates**: Days until expiry, prints warnings if ≤ threshold.

---

### 3.3 File Operations

.get_existing_file_content(repo_name: str, file_name: str) → Optional[str]  
  - **Fetches**: Base64-encoded file via `GET /repository/files/{file_name}` on `self.branch` or `default_branch`.

.fetch_pipeline_file(full_path: str, ref: str = None) → str  
  - **Gets**: Raw content via `GET /repository/files/{full_path}/raw?ref={ref}`  
  - **Falls back**: to current branch then default branch.

.list_pipeline_files(subpath: str = None) → list[str]  
  - **Paginates**: `GET /repository/tree` (per_page=100), filters for `.json` files.

.list_all_blobs(subpath: str) → list[str]  
  - **Recursively lists**: All blobs under `subpath` via `GET /repository/tree?recursive=true`.

.delete_file(file_name: str, branch: str = None) → None  
  - **Deletes**: `DELETE /repository/files/{file_name}?branch={branch}&commit_message=...`

._commit_deletion_actions(paths_to_delete: list[str], branch: str, commit_msg_prefix: str) → None  
  - **Batches**: Up to 50 deletions per commit via `POST /repository/commits` with `actions=[{action: 'delete', file_path: ...}]`.

---

### 3.4 Pushing to Git

._push_file(proj_encoded: str, file_name: str, file_content: str, branch: str) → None  
  - **Creates**: `POST /repository/files/{file_name}` for new files.  
  - **Updates**: `PUT /repository/files/{file_name}` on 400/409 responses.

._extract_and_push_code(pipeline_name: str, job_config: dict) → None  
  - **Scans**: Recursively finds tasks of type `PYSPARK` or `SQL`.  
  - **Pushes**: Full `.py` scripts and extracted SQL blocks as `.sql` files.

.push_to_git() → None  
  - **Validates**: Current branch against slug, mapping, or feature pattern.  
  - **Pass 1**: Iterates `self.aic.pipeline_configs`, pushes JSON into `config/{name}.json`.  
  - **Pass 2**: Calls `_extract_and_push_code` to sync code under `code/{pipeline}`.  
  - **Email**: Summarizes results via `_format_push_summary` and `_send_email`.

._format_push_summary(branch: str, results: list[tuple]) → (subject: str, body: str)  
  - **Builds**: HTML email table of `(pipeline, status, message)` rows.

---

### 3.5 Deploying Pipelines

.deploy_pipelines(force_sync: bool = False) → None  
  - **Validates**: Branch→workspace via `self.mapper`.  
  - **Enforces**: Allowed branch patterns (slug, default, release/, feature/).  
  - **Force-sync**: If True, reads every `config/*.json` and upserts into AIC.  
  - **Hash-sync**: Else, compares old vs new JSON hashes, updates only changed pipelines.  
  - **Email**: Sends deployment summary via `_format_deploy_summary`.
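
Typical calls from a branch that maps to the target workspace (a sketch; `mgr` is a GitLabManager instance as configured in section 4.3):

```python
# Hash-based sync: only pipelines whose config JSON has changed are updated.
mgr.deploy_pipelines()

# Force-sync: read every config/*.json on the branch and upsert it into AIC.
mgr.deploy_pipelines(force_sync=True)
```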

---

### 3.6 Automated Branch Creation

.create_feature_branch(feature_name: str) → None  
  - **Requires**: Current branch is `sandbox`.  
  - **Creates**: `feature/{feature_name}` off `sandbox`, switches `self.branch`.

.create_release_branch(branch_name: str) → None  
  - **Requires**: Current branch is `develop`.  
  - **Creates**: `release/{branch_name}`, deletes any `config/` or `code/` files not in `self.aic.pipeline_configs`.

.create_hotfix_branch(branch_name: str) → None  
  - **Determines**: Production branch via mapper or default.  
  - **Creates**: `hotfix/{branch_name}` off prod, deletes pipelines not allowed.

---

### 3.7 Branch↔Workspace Mapping

BranchWorkspaceMapper(mapping_file: str = "branch_to_workspace.yaml")

- **Loads**: YAML mapping from `default_branch`.  
- **raw_mapping**: Dict of exact branch→workspace.  
- **_prefix_patterns**: Compiled regex for wildcard rules.

.lookup(branch: str) → str  
- **Exact match** in `raw_mapping`.  
- **Prefix match** for entries ending `/*`.  
- **Built-in**: `feature/` → develop, `release/` & `hotfix/` → main.

.workspace_to_branch(workspace: str) → str  
- **Reverse lookup** for a mapping key matching given workspace.  
- **Fallback slug** if none found.
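
A simplified sketch of the lookup rules above (illustrative only, not the shipped implementation):

```python
def lookup(branch: str, raw_mapping: dict) -> str:
    """Resolve a branch name to a workspace using the rules described above."""
    if branch in raw_mapping:                        # exact match
        return raw_mapping[branch]
    for pattern, workspace in raw_mapping.items():   # wildcard entries such as 'release/*'
        if pattern.endswith('/*') and branch.startswith(pattern[:-1]):
            return workspace
    # Built-in conventions: feature branches resolve via 'develop',
    # release and hotfix branches resolve via 'main'.
    if branch.startswith('feature/'):
        return raw_mapping['develop']
    if branch.startswith(('release/', 'hotfix/')):
        return raw_mapping['main']
    raise KeyError(f"No workspace mapping found for branch '{branch}'")
```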

---

### Under the Hood

- **Concurrency**: ThreadPoolExecutor used only in AIC, not here.  
- **Error Handling**: `response.raise_for_status()`, catch HTTPError around critical calls.  
- **Hashing**: Uses MD5 normalization to detect JSON changes.  
- **Logging**: Verbose `print()` statements for visibility during operations.  
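
The hash comparison can be pictured as follows, assuming "normalization" means serializing the JSON with sorted keys before hashing (a sketch, not the exact implementation):

```python
import hashlib
import json

def config_hash(config: dict) -> str:
    """Hash a pipeline config so that key order and whitespace don't affect the result."""
    normalized = json.dumps(config, sort_keys=True, separators=(',', ':'))
    return hashlib.md5(normalized.encode('utf-8')).hexdigest()

def has_changed(old_config: dict, new_config: dict) -> bool:
    """Only push or deploy a pipeline when its normalized hash actually changed."""
    return config_hash(old_config) != config_hash(new_config)
```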


---

## 4. User Guide

### 4.1 Repository Creation & Default Branch

1. **Create the GitLab project** under your namespace (e.g. `pin/pin-analytics/pin-fusion-2.0`).
2. **Set `main` as the default and protected branch** in _Settings → Repository → Branches_. This serves as the **Prod** branch in our mapping.
3. **Add a `branch_to_workspace.yaml`** on `main` to create a 1:1 mapping of desired workspaces to designated Git branches.

```yaml
mapping:
  main:      PIN FUSION 2.0
  develop:   PIN FUSION 2.0 QA
  sandbox:   PIN FUSION 2.0 DEV

```
- **`mapping`**: exact branch names.

### 4.2 Preparing AIC() object
To use the GitLabManager, create an AIC object that encapsulates the workspace in question as well as the pipelines required for the Git operations.

To obtain an AIC API key, log into AIC and, in the top-right corner, navigate to ```User Profile -> Manage API Keys```.

```python 
aic_prod = AIC(
    api_key   = "<API-KEY>",
    project   = "dylan.doyle@jdpa.com",
    workspace = "PROD TEST", 
    pipelines = ['test_job']
)
```
The correct branch will be selected or created based on the workspace passed and the `branch_to_workspace.yaml` file. If the file is not found or the workspace is not mapped, the branch name defaults to a slug derived from the workspace name.

**NOTE**: Operations will be completely limited to this workspace and only the pipelines passed. Add more pipelines as desired, or pass ['*'] to load ALL pipelines. 
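
For example, loading every pipeline in the workspace (same placeholder API key as above):

```python
aic_prod_all = AIC(
    api_key   = "<API-KEY>",
    project   = "dylan.doyle@jdpa.com",
    workspace = "PROD TEST",
    pipelines = ['*']      # load ALL pipelines in the workspace
)
```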

### 4.3 Preparing GitLabManager() object
Create a GitLabManager instance and pass the previously created AIC object so that the GitLab API can operate directly on the objects in the AIC workspace.

```python
mgr = GitLabManager(
    aic_instance     = aic_prod,
    gitlab_token     = "<GIT_TOKEN>",
    gitlab_namespace = "pin/pin-analytics",
    repo_folder     = "aic-git-integration"
)
```

**TIP:** Multiple instances can be created for multiple workspaces (e.g., for both Prod and QA instances). Each AIC workspace must have its own AIC instance and respective GitLabManager instance.
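
For instance, a Prod and a QA pair might be configured side by side (the QA workspace name and tokens below are placeholders):

```python
aic_prod = AIC(api_key="<API-KEY>", project="dylan.doyle@jdpa.com",
               workspace="PROD TEST", pipelines=['test_job'])
aic_qa   = AIC(api_key="<API-KEY>", project="dylan.doyle@jdpa.com",
               workspace="QA TEST", pipelines=['test_job'], qa=True)

mgr_prod = GitLabManager(aic_instance=aic_prod, gitlab_token="<GIT_TOKEN>",
                         gitlab_namespace="pin/pin-analytics",
                         repo_folder="aic-git-integration")
mgr_qa   = GitLabManager(aic_instance=aic_qa, gitlab_token="<GIT_TOKEN>",
                         gitlab_namespace="pin/pin-analytics",
                         repo_folder="aic-git-integration")
```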

### 4.4 Syncing AIC code to Git

```python
mgr.push_to_git()
```

- **When**:
  - Initial branch/workspace git loads
  - When work is performed on `develop` or `sandbox` branches
  - Prior to cutting a `release` or `feature` branch
  - After creating a hotfix branch, to sync changes from the production workspace
- **What happens**:
  - Respective ```config/<pipeline>.json``` files are upserted via the GitLab API based on the `AIC.pipelines` list that was passed earlier.
  - Embedded PySpark/SQL code is extracted and pushed under ```code/<pipeline>/```.

- **Restrictions**:
  - Cannot `push_to_git()` to the `main` branch unless `force_push=True` is passed. This restricts production branches primarily to release and hotfix merges.
  - Cannot `push_to_git()` to a branch corresponding to a different workspace.
  - Cannot `push_to_git()` to feature or release branches; these branches must be cut from the respective source environments containing the updates in question.

---

### 4.5 Sandbox → Develop Push


1. **Create `sandbox` objects**
   ```python
   aic_sandbox = AIC(workspace='sandbox_workspace', ...)
   mgr = GitLabManager(aic_instance=aic_sandbox, ...)
   ```
2. **Sync changes to `sandbox`**

   ```python
   mgr.push_to_git() # push current AIC state so sandbox branch captures the changes
   ```
3. **Cut a feature branch**

   ```python
   mgr.create_feature_branch("PINOPS-1831")     # sets mgr.branch to feature branch
   ```
4. **Create merge request and resolve**

   ```python
   mgr.create_merge_request(title="Demographics Module", description='Push for QA Testing', target_branch='develop')
   ```

5. **Deploy**

   ```python
   mgr.deploy_pipelines()                     # deploys merged pipelines to develop workspace
   ```
- **When**: Releasing a feature/code from `sandbox` or other similar environments. This is not a feature release to production.
- **What happens**:
  - Creates a new branch ```feature/PINOPS-1831``` off ```sandbox```.
  - Pushes all current ```config/``` and ```code/``` for the specified pipelines into that branch.
- **Restrictions**:
   - Cannot open a feature branch from `main` or `develop` branches; feature branch creation is only for promoting code from sandbox-like environments.
---

### 4.6 QA → Prod Release

1. **Create `develop` objects**
   ```python
   aic_qa = AIC(workspace='develop_workspace', ...)
   mgr = GitLabManager(aic_instance=aic_qa, ...)
   ```
2. **Prepare QA**

   ```python
   # Starting from develop branch/workspace, ensure AIC has updated code
   mgr.push_to_git() # sync develop environment to capture new updates
   ```

3. **Cut Release Branch**

   ```python
   mgr.create_release_branch("PINOPS-1832")       # sets mgr.branch to 'release/PINOPS-1832'
   ```
   - Prunes any pipelines not in ```mgr.aic.pipeline_configs```.

4. **Create Merge Request and Resolve**

   ```python
   mgr.create_merge_request(
     title="Release v2.1.0",
     description="QA-approved changes for selected pipelines"
   )
   ```
   - Opens MR from ```release/PINOPS-1832``` → ```main```.

5. **Deploy**

   ```python
   mgr.deploy_pipelines()  # deploys merged pipelines to production workspace
   ```

- **When**: Releasing updates from `develop` to `main`.
- **What happens**:
  - Creates a new branch ```release/PINOPS-1832``` off ```develop```.
  - Pushes all current ```config/``` and ```code/``` for the specified pipelines into that branch.
- **Restrictions**:
   - Release branches can only be cut from the `develop` branch.

---

### 4.7 Production Hotfix

1. **Diagnose & Plan**
   - Identify the pipeline(s) requiring an urgent fix.

2. **Create `main` objects**
   ```python
   aic_prod = AIC(workspace='prod_workspace', pipelines=['<PIPELINE_FROM_STEP1>'], ...)
   mgr = GitLabManager(aic_instance=aic_prod, ...)
   ```
   
3. **Create Hotfix Branch**

   ```python
   mgr.create_hotfix_branch("PINOPS-1831")
   # At this point, AIC prod environment should have the fix in place. 
   mgr.push_to_git() # Sync the updated code to the hotfix branch and NOT the production branch
   ```
   - Creates ```hotfix/PINOPS-1831``` off ```main``` and pushes the fix to hotfix branch.

4. **Prepare Hotfix branch**

   ```python
   # Starting from the hotfix branch (prod workspace), ensure AIC has the updated code
   mgr.push_to_git() # sync hotfix branch to current prod workspace
   ```
   **NOTE**: The workflow for a hotfix is the opposite of feature/release branches. Because the change is made in the AIC production workspace first, the `hotfix` branch is updated with `push_to_git()` while the `main` branch in the repository remains one commit behind. This ensures that fixes made exclusively in production are captured by a merge request and appropriately audited.

5. **Open & Merge MR**

   ```python
   mgr.create_merge_request(
     title="Hotfix PINOPS-1831",
     description="Emergency fix: correct streaming timeout",
     target_branch="main"
   )
   ```

6. **Deploy & Back-merge**

   _TODO: revise the means of back-merging hotfixes._

- **When**: Releasing immediate hotfixes straight from production.
- **What happens**:
  - Creates a new branch ```hotfix/PINOPS-1831``` off ```main```.
    **The production branch will not contain the hotfix at this point.**
  - The hotfix branch is updated with the new code additions.
  - The hotfix is then merged back into the production branch, bringing it in sync with the AIC workspace and providing an audit trail and rollback path.
- **Restrictions**:
   - Hotfixes can only be created from the `main` branch.
---

## 5. Key Principles & Best Practices

1. **1:1 Branch↔Workspace**: Enforced via `branch_to_workspace.yaml`. No cross-environment deployments.
2. **MR‑only Promotions**: `main`, `develop`, `sandbox` are protected; all changes flow via `feature/*`, `release/*`, or `hotfix/*`.
3. **Audit Trail**: Every pipeline change lives in Git; interactive branches on AIC mirror the same JSON.
4. **No CLI**: End users only call Python methods; they never need to know Git commands.
5. **Error Handling**: Guards in code refuse unsafe operations (wrong branch, unmapped workspace).

---

## 6. Further Reading & Support

- **Contact**: Dylan Doyle.
