datasure 0.1.2

- **Summary**: IPA Data Management System Dashboard
- **Author**: Innovations for Poverty Action
- **Requires Python**: >=3.11
- **License**: MIT License, Copyright (c) 2024 Innovations for Poverty Action
- **Keywords**: data-quality, survey-data, streamlit, monitoring, hfc
- **Upload time**: 2025-07-24 21:46:58
# DataSure

IPA Dashboard Solution for Data Management Systems.

## Development set up

Development relies on the following software:

- `winget` (Windows) or `homebrew` (MacOS/Linux) for package management and installation
- `git` for source control management
- `just` for running common command line patterns
- `uv` for installing Python and managing virtual environments

First, clone this repository to your local computer, either via GitHub Desktop or from the command line:

```bash
# If using HTTPS
git clone https://github.com/PovertyAction/dms-dashboard.git

# If using SSH
git clone git@github.com:PovertyAction/dms-dashboard.git
```

This repository uses a `Justfile` for collecting common command line actions that we run
to set up the computing environment and build the project's assets. Note that you
should also have Git installed.

To get started, make sure you have `Just` installed on your computer by running the
following from the command line:

| Platform  | Commands                                                            |
| --------- | ------------------------------------------------------------------- |
| Windows   | `winget install Git.Git Casey.Just astral-sh.uv` |
| Mac/Linux | `brew install just uv gh`                                          |

This will make sure that you have the latest version of `Just`, as well as
[uv](https://docs.astral.sh/uv/) (an installer for Python and manager of virtual
environments).

- We use `Just` in order to make it easier for all IPA users to be productive with data
  and technology systems. The goal of using a `Justfile` is to help make the end goal of
  the user easier to achieve without needing to know or remember all of the technical
  details of how we get to that goal.
- We use `uv` to help ease use of Python. `uv` provides a global system for creating and
  building computing environments for Python.
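You can confirm the tools are available by checking their versions:

```bash
# Each tool prints its version if it is installed and on your PATH
git --version
just --version
uv --version
```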

As a shortcut, if you already have `Just` installed, you can run the following to
install the required software and build the Python virtual environment used by the
project:

```bash
just get-started
```

Note: you may need to restart your terminal after running the command above to activate
the installed software.

After the required software is installed, you can activate the Python virtual
environment:

| Shell      | Commands                                |
| ---------- | --------------------------------------- |
| Bash       | `.venv/Scripts/activate`                |
| Powershell | `.venv/Scripts/activate.ps1`            |
| Nushell    | `overlay use .venv/Scripts/activate.nu` |

Note: the paths above use the Windows `Scripts` layout; on Linux/macOS the activation
scripts live in `.venv/bin/` (e.g., `source .venv/bin/activate`).
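Once activated, a quick sanity check is to confirm the shell is using the environment's
interpreter:

```bash
# Should print a path inside the project's .venv directory
python -c "import sys; print(sys.executable)"
```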

## Available Justfile Commands

This project uses [Just](https://github.com/casey/just) as a command runner to simplify common development tasks. Here are the available commands:

### Environment Setup

```bash
just get-started          # Complete setup (install software + create venv)
just venv                 # Create virtual environment and install dependencies
just clean                # Remove virtual environment
just activate-venv        # Activate the virtual environment
```

### Development

```bash
uv run datasure           # Launch the DataSure application
just lab                  # Launch Jupyter Lab
```

### Code Quality

```bash
just lint-py              # Lint Python code with Ruff
just fmt-python           # Format Python code with Ruff
just fmt-py <file>        # Format a specific Python file
just fmt-markdown         # Format all markdown files
just fmt-md <file>        # Format a specific markdown file
just fmt-check-markdown   # Check markdown formatting
just fmt-all              # Format all code and markdown files
just pre-commit-run       # Run pre-commit hooks
```

### Testing

```bash
just test                 # Run all tests
just test-cov             # Run tests with coverage report (terminal)
just test-cov-html        # Run tests with HTML coverage report
just test-cov-xml         # Run tests with XML coverage report (for CI)
```

### Package Building

```bash
just build-package        # Build both wheel and source distribution
just clean-build          # Clean build artifacts
just install-package      # Install the package locally from built wheel
just uninstall-package    # Uninstall the package
just test-cli             # Test the CLI after installation
just package-workflow     # Complete workflow: test, build, and verify
```

### Publishing

```bash
just check-pypi           # Check package metadata and structure
just pypi-info            # View package info and version
just publish-test         # Publish to TestPyPI (for testing)
just publish              # Publish to PyPI (production)
```

### Utilities

```bash
just system-info          # Display system information
just update-reqs          # Update project dependencies
```

## Testing the Streamlit App

Follow these steps to test the app:

### 1. Prepare Your Environment

- Ensure all necessary files are on your local machine. To do this, pull the latest updates from the GitHub repository:
  - **Using Visual Studio Code (VS Code):** Sync files through the Source Control panel.
  - **Using Command Line:** Run the following command in your terminal:

    ```bash
    git pull
    ```

### 2. Navigate to the Repository

- Open your terminal (VS Code terminal, Command Prompt, or PowerShell).
- Navigate to the folder where the repository is located.

### 3. Start the App

- Run the following command to launch the app:

    ```bash
    uv run datasure
    ```

---

### App Features

#### Import Data Page

- When the app starts, the **Import Data** page is displayed.
- This page includes four tabs for connecting datasets. Currently, only the **SurveyCTO** and **Local Storage** tabs are functional.
- Use these tabs to upload or connect your datasets.

#### Prepare Data Page

- After importing data, go to the **Prepare Data** page to preview your datasets. Each dataset will appear in a separate tab.
- **Note:** This section is still under development. While the functions listed won't work yet, you can review them and suggest additional features.

#### Configure Checks Page

- Set up **HFCs** (High-Frequency Checks) on this page:
  1. Enter a name in the **Page Name** input box.
  2. Select a dataset from the **Select Data** dropdown.
  3. Additional input fields will appear as you provide information.
  4. Once the form is complete, click **Add Page** and save the settings.
- This will create an HFC page, but currently, you can only set up one HFC page at a time.
- If the HFC page doesn’t appear immediately, select another page from the left navigation menu and return.

#### HFC Page

- The HFC page contains dashboards for various checks, organized into tabs.
- To set up the checks:
  1. Open a tab and expand the **Settings Expander** at the top.
  2. Configure the settings as needed for the check to display the required output.

## Running Tests

The project uses the Python `pytest` framework for testing; the test files are located in the `tests/` directory.

To run all tests, execute the following command from the project root directory:

```bash
uv run python -m pytest
```

To run a specific test file, use:

```bash
uv run python -m pytest tests/test_file.py
```
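`pytest` also accepts node IDs, so you can target a single test function (the file and
test names below are illustrative):

```bash
# Run one test function verbosely; replace the node ID with a real one
uv run python -m pytest tests/test_file.py::test_example -v
```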

## Package Building and Distribution

DataSure is set up as a proper Python package using [uv](https://docs.astral.sh/uv/) with the `uv_build` backend for simple and fast building and publishing.

### Building the Package

To build the package for distribution:

```bash
# Build both wheel and source distribution
just build-package

# Or use uv directly
uv build
```

This creates two files in the `dist/` directory:

- `datasure-{version}-py3-none-any.whl` (wheel distribution)
- `datasure-{version}.tar.gz` (source distribution)
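Since wheels are zip archives, you can sanity-check the wheel's contents with Python's
standard-library `zipfile` module:

```bash
# List the files packaged into the wheel
uv run python -m zipfile -l dist/datasure-*.whl
```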

### Testing the Package

To test the built package locally:

```bash
# Install the package locally
just install-package

# Or install directly from the wheel
uv pip install dist/datasure-*.whl
```

### Using the CLI

Once installed, you can use the command-line interface:

```bash
# Show version
uv run datasure --version

# Launch the dashboard (default: localhost:8501)
uv run datasure

# Launch with custom host/port
uv run datasure --host 0.0.0.0 --port 8080
```

### Package Development Workflow

1. **Make changes** to the code
2. **Update version** in `pyproject.toml`
3. **Run tests** to ensure everything works:

   ```bash
   just test
   ```

4. **Build the package**:

   ```bash
   just build-package
   ```

5. **Test the package installation**:

   ```bash
   just install-package
   uv run datasure --version
   ```

### Version Management

DataSure uses automated version management through `uv version` commands. The package follows [semantic versioning](https://semver.org/):

- **MAJOR** version when you make incompatible API changes
- **MINOR** version when you add functionality in a backward compatible manner
- **PATCH** version when you make backward compatible bug fixes

#### Version Bump Commands

```bash
# Alpha releases (early development testing)
just bump-patch-alpha     # 0.1.0 -> 0.1.1a1
just bump-minor-alpha     # 0.1.0 -> 0.2.0a1
just bump-major-alpha     # 0.1.0 -> 1.0.0a1

# Beta releases (feature-complete testing)
just bump-patch-beta      # 0.1.0 -> 0.1.1b1
just bump-minor-beta      # 0.1.0 -> 0.2.0b1
just bump-major-beta      # 0.1.0 -> 1.0.0b1

# Release candidates (final testing)
just bump-patch-rc        # 0.1.0 -> 0.1.1rc1
just bump-minor-rc        # 0.1.0 -> 0.2.0rc1
just bump-major-rc        # 0.1.0 -> 1.0.0rc1

# Final releases
just bump-patch           # 0.1.0 -> 0.1.1
just bump-minor           # 0.1.0 -> 0.2.0
just bump-major           # 0.1.0 -> 1.0.0
```

These commands automatically do the following (a rough command-level sketch follows the list):

- Update the version in `src/datasure/__init__.py`
- Run `uv sync` to update the lock file
- Commit the changes to git
- Create a git tag for the new version
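A minimal sketch of what a bump recipe runs under the hood, assuming recent uv's
`uv version --bump` interface (the recipe body is illustrative, not copied from the
project's `Justfile`):

```bash
# Hypothetical equivalent of `just bump-patch`
uv version --bump patch                                  # update the package version
uv sync                                                  # refresh the lock file
git commit -am "Bump version to $(uv version --short)"   # commit the change
git tag "v$(uv version --short)"                         # tag like v0.1.2
```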

#### Git Tag Management

```bash
# Create git tag for current version (if it doesn't exist)
just tag-version          # Creates tag like v0.1.2

# Push tag to remote repository
just push-tag            # Push the current version tag

# Push both commits and tags
just push-all            # Push commits and current version tag
```

**Note:** The version bump commands (`just bump-*`) automatically create git tags, so you typically don't need to run `just tag-version` manually.
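For reference, the plain-git equivalents of these recipes are roughly the following
(tag name illustrative):

```bash
git tag v0.1.2                  # create the tag locally
git push origin v0.1.2          # push a single tag
git push && git push --tags     # push commits and all tags
```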

### Testing the Build and Publish Workflow

Before publishing your package, it's essential to test the entire workflow using TestPyPI:

#### 1. Set Up TestPyPI Account

1. Log in at <https://test.pypi.org/account> (you need to be a member of the IPA PyPI organization)
2. Generate an API token:
   - Go to <https://test.pypi.org/manage/account/>
   - Click "Add API token"
   - Give it a name (e.g., "datasure-test")
   - Copy the token (starts with `pypi-`)

#### 2. Configure Authentication

Set the `UV_PUBLISH_TOKEN` environment variable with your TestPyPI token:

**Windows (PowerShell):**

```powershell
$env:UV_PUBLISH_TOKEN = "pypi-your-token-here"
```

**Windows (Command Prompt):**

```cmd
set UV_PUBLISH_TOKEN=pypi-your-token-here
```

**Linux/macOS:**

```bash
export UV_PUBLISH_TOKEN="pypi-your-token-here"
```

**Permanent Setup (recommended):**
Add the token to your shell profile (`.bashrc`, `.zshrc`, or Windows Environment Variables) to avoid setting it each time.
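For example, on Bash (use `~/.zshrc` for Zsh):

```bash
# Persist the token for future sessions; reload with `source ~/.bashrc`
echo 'export UV_PUBLISH_TOKEN="pypi-your-token-here"' >> ~/.bashrc
```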

#### 3. Test the Complete Workflow

```bash
# 1. Clean any existing build artifacts
just clean-build

# 2. Bump version for testing (use alpha for test releases)
just bump-patch-alpha

# 3. Verify the version was updated
uv run datasure --version

# 4. Build the package
just build-package

# 5. Publish to TestPyPI
just publish-test

# 6. Install from TestPyPI to verify it works
uv pip install --index-url https://test.pypi.org/simple/ datasure
```
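If step 6 fails to resolve dependencies, it is usually because they are not published
on TestPyPI; adding the real index as a fallback typically resolves it:

```bash
# Pull datasure from TestPyPI but resolve its dependencies from PyPI
uv pip install --index-url https://test.pypi.org/simple/ \
  --extra-index-url https://pypi.org/simple/ datasure
```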

#### 4. Troubleshooting Common Issues

**Version Already Exists Error:**

```bash
error: Local file and index file do not match for datasure-X.Y.Z
```

Solution: Bump the version again; you cannot republish the same version.

**Authentication Error:**

```bash
error: 401 Unauthorized
```

Solution: Verify your `UV_PUBLISH_TOKEN` is set correctly and the token is valid.
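A quick way to check that the token is set without printing it in full (Bash syntax):

```bash
# Show only the first characters; the output should start with "pypi-"
echo "${UV_PUBLISH_TOKEN:0:8}"
```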

### Publishing to PyPI (Production)

Once you've successfully tested with TestPyPI, you can publish to PyPI:

#### 1. Set Up PyPI Account

1. Create an account at <https://pypi.org/account/register/>
2. Generate an API token at <https://pypi.org/manage/account/>
3. Set the token as `UV_PUBLISH_TOKEN` (same as TestPyPI setup)

#### 2. Production Publishing Workflow

```bash
# 1. Ensure you're on the main branch with latest changes
git checkout main
git pull

# 2. Run tests to ensure everything works
just test

# 3. Bump to final version (automatically creates git tag and commits)
just bump-patch  # or bump-minor/bump-major as appropriate

# 4. Push changes and tags to trigger automated release
just push-all

# 5. GitHub Actions will automatically:
#    - Run Code Coverage workflow (tests + quality checks)
#    - If successful, run Build and Release workflow
#    - Build package and publish to PyPI
#    - Create GitHub release with artifacts
```

## Automated Release Pipeline

DataSure uses GitHub Actions for automated testing and releasing:

### Workflow Dependencies

1. **Code Coverage Workflow** (`.github/workflows/build.yml`)
   - Runs on: branches `main`, tags `v*`, and pull requests
   - Executes: pre-commit hooks, tests, SonarQube analysis
   - **Must pass** before releases can proceed

2. **Build and Release Workflow** (`.github/workflows/build-and-release.yml`)
   - **Triggered by**: Code Coverage workflow completion
   - **Only runs if**: Code Coverage succeeded AND triggered by tag push
   - Executes: package building, PyPI publishing, GitHub release creation

### Release Process

```bash
# Step 1: Create a release (this triggers both workflows)
just bump-patch  # Creates git tag v1.0.1

# Step 2: Push to trigger automation
just push-all    # Pushes commits and tags

# Step 3: Monitor workflows in GitHub Actions
# - Code Coverage runs first (quality gate)
# - Build and Release runs only if Code Coverage passes
# - Package published to PyPI automatically
# - GitHub release created with artifacts
```

### Manual Release Override

For emergency releases bypassing quality checks:

```bash
# Trigger Build and Release workflow manually
# Go to GitHub Actions → Build and Release → Run workflow
# Enter version (e.g., v1.0.1) and click "Run workflow"
```

### Quality Gates

- **Pre-commit hooks**: Code formatting and linting
- **Test suite**: All tests must pass
- **SonarQube analysis**: Code quality and security checks
- **Failed quality checks** = **No release**

### Verifying the Package Before Publishing

Before publishing, you can verify your package:

```bash
# Check package metadata and structure
just check-pypi

# View package info
just pypi-info
```

**Note:** The project now uses `uv publish` for all publishing operations.

## Data Storage and Cache

DataSure automatically manages data storage and caching for optimal performance across different environments:

### Cache Directory Locations

- **Development Mode** (when running from source): `./cache/` (in project root)
- **Production Mode** (when installed as package):
  - **Windows**: `%APPDATA%/datasure/cache/`
  - **Linux/macOS**: `~/.local/share/datasure/cache/`

### What's Stored

The cache directory contains:

- **Project configurations**: HFC page settings and form configurations
- **Database files**: DuckDB databases for processed survey data
- **SurveyCTO cache**: Cached form metadata and server connections
- **User settings**: Check configurations and preferences

### Cache Management

- Cache directories are created automatically when needed
- No manual setup required - DataSure detects the environment and uses appropriate paths
- Development and production modes use separate cache locations
- Cache is preserved between application sessions
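If you ever need to inspect or clear the cache manually, the paths above can be used
directly (Linux/macOS shown; use `%APPDATA%\datasure\cache\` on Windows):

```bash
# List the production cache contents
ls -la ~/.local/share/datasure/cache/
```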

## Code Quality Reports

Code quality metrics and reports are available on SonarQube Cloud:

- **Dashboard**: [https://sonarcloud.io/project/overview?id=PovertyAction_datasure](https://sonarcloud.io/project/overview?id=PovertyAction_datasure)

The SonarQube dashboard provides insights into code coverage, code smells, bugs, vulnerabilities, and maintainability ratings.