# P&G Data Science Developer Class - Problem Set 2 - Utils Library



A re-usable utilities library for DS-DEV problem-set 2.



Replace the below with your own build badge and link to SonarQube dashboard:





## Data Science Utils



Common utilities for DS-DEV Problem Sets



You will typically create a new project for each _application_ that you develop;
however, there are many instances where you want to re-use functionality across
multiple applications.  If you copy and paste functions into different projects, you
increase the difficulty of maintaining those functions and may limit your ability
to improve them across projects (introduce new features, etc.).



In this problem set we will create a utilities _library_ that will enable us to
import re-usable functionality from a central repository.  We will be able to use this
library for future problem sets and continue to evolve it over time to provide common,
re-usable functionality.



## Problem Set Objectives



* Understand difference between Application & Library

* Learn how to package a Python library

* Implement a Context Manager

* Implement a Decorator

* Use a Cookiecutter template

* Introduction to matrix testing

* Execute Unit Test via Fake/Mock



### Grading Criteria



* [15 pts]  Build badge and SonarQube links are updated and functional

* [15 pts]  blob_file context manager is implemented correctly

* [15 pts]  Library semantic versioning is implemented correctly

* [15 pts]  Appropriate tests have been implemented to cover defined functions & classes

* [10 pts]  Git commits are clear and well structured

* [10 pts]  Code is pythonic, has no major SonarQube issues and well documented

* [10 pts]  GitHub Action test is implemented for Python v3.7, v3.8 & v3.9

* [10 pts]  Optional: Using release-please for release automation



## Preface



<!-- START doctoc generated TOC please keep comment here to allow auto update -->

<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->

**Table of Contents**  *generated with [DocToc](https://github.com/thlorenz/doctoc)*



- [Library Versioning](#library-versioning)

- [Required Modifications](#required-modifications)

  - [Updating Setup.py](#updating-setuppy)

  - [Complete GitHub Actions](#complete-github-actions)

- [Installing library locally for development](#installing-library-locally-for-development)

- [Utility implementations](#utility-implementations)

  - [Context Manager to download a file from Azure blob](#context-manager-to-download-a-file-from-azure-blob)

    - [Testing](#testing)

- [Installing library in external application](#installing-library-in-external-application)

  - [Install the utils](#install-the-utils)

  - [Testing imported library in application](#testing-imported-library-in-application)

    - [Updating](#updating)

- [Recommended](#recommended)



<!-- END doctoc generated TOC please keep comment here to allow auto update -->



## Library Versioning



For a package/library, it is especially important that your branching workflow
reflect the 'production' status of your library.



After your initial tagged commit of this repo, you must conform to a formal
git workflow.  This means:



1. Your `main` branch should be ***merge-only***.  That is, never commit work to it
   directly; only merge from `feature/*`, `develop/*`, `hotfix/*`, or similar.
2. Each new merged commit on main must have a
   [Semantic Versioning](https://semver.org/) release version with an
   accompanying tag.  TL;DR:
   * `major.minor.patch`
   * Patch is for bugfixes
   * Minor is for new features
   * Major is for backwards-incompatible changes
   * Don't worry about high version numbers
   * Tags should be of the form `v0.1.2`
3. [Optional] You can use [release-please](https://github.com/google-github-actions/release-please-action) to automate tagging and releasing your repository using [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/); see the sketch below.
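If you adopt conventional commits, the commit-message prefix determines the semver bump that release-please proposes.  For example (the messages themselves are illustrative):

```bash
git commit -m "fix: close temp file handle on error"   # -> patch bump
git commit -m "feat: add blob_file context manager"    # -> minor bump
git commit -m "feat!: drop support for Python 3.6"     # -> major bump
```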



## Required Modifications



The files provided in your problem-set repository require a few adjustments.



### Updating Setup.py



* Update lines 118-119 with the URL to your repository and your name.
* Update line 121 (classifiers) to support 3.7, 3.8, and 3.9.
* Update line 127 (python_requires) to support 3.7, 3.8, and 3.9 (see the sketch below).
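A minimal sketch of what those fields might look like after the update (the URL and author are placeholders, and your line numbers may differ):

```python
from setuptools import setup

setup(
    # ... other arguments unchanged ...
    url="https://github.com/pg-ds-dev-class/ds-dev-utils-<GITHUBID>",  # your repo
    author="Your Name",
    classifiers=[
        "Programming Language :: Python :: 3.7",
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9",
    ],
    python_requires=">=3.7, <3.10",  # allows 3.7, 3.8 and 3.9
)
```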



If you are not familiar with [setup.py](setup.py), you may want to review [this overview](https://godatadriven.com/blog/a-practical-guide-to-using-setup-py/).



In particular, note that required libraries are installed via the `install_requires` attribute in setup.py, and optional libraries can be installed via `extras_require`.  Review the `load_deps` function in setup.py to see how this is automatically managed.
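The template's actual `load_deps` may differ, but a minimal sketch of the idea (reading pinned requirements files into `install_requires`/`extras_require`) looks like this:

```python
from pathlib import Path

def load_deps(filename):
    """Return requirement strings from a pip requirements file,
    skipping blank lines and comments."""
    lines = Path(filename).read_text().splitlines()
    return [ln.strip() for ln in lines
            if ln.strip() and not ln.lstrip().startswith("#")]

# Used inside setup(), e.g.:
# install_requires=load_deps("requirements.txt"),
# extras_require={"devel": load_deps("requirements-devel.txt")},
```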



### Complete GitHub Actions



The PyPod-created GitHub Actions have been replaced with a simplified `project-build.yml` file.



As part of this custom build file we would like to use
[GitHub Actions' matrix strategy](https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstrategymatrix) to test installation of our library with multiple versions of Python (v3.7, v3.8 & v3.9) for completeness.



More information on using a strategy matrix can be found [here](https://ncorti.com/blog/howto-github-actions-build-matrix).



However, we only want SonarQube results stored for Python 3.7.  Implement the workflow
so that it only submits code coverage/SonarQube reports when running with Python 3.7
(there is no need to submit redundant code coverage for each matrix test run).
A sketch of this pattern is shown below.
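The step names, action versions, and SonarQube invocation in this sketch are placeholders rather than the contents of the actual `project-build.yml`:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.7", "3.8", "3.9"]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install and test
        run: |
          pip install -e .[devel]
          pytest --cov=ds_dev_utils
      # Submit coverage/SonarQube results only once, from the 3.7 job
      - name: SonarQube scan
        if: matrix.python-version == '3.7'
        run: echo "invoke your SonarQube scanner step here"
```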



Check out the [GitHub workflow syntax](https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions) for
additional details if needed.



## Installing library locally for development



The following 'editable' install links the package to its original location, meaning
any changes to the source are reflected directly in your environment.  It is most
useful while you are developing the package on your own system.



You should never `pip install` anything else for a library; rather, dependencies
should be added to [setup.py](setup.py) and then installed via `pip install --upgrade`
or similar.  You may, however, have to `pip install` _development_-specific
requirements (or place them as an `extras_require` install).



Install editable library:



```bash
# General usage
pip install -e <project path or a VCS url>

# Install the library you are working on (execute from where setup.py is at)
pip install -e .
```



Install editable library with development requirements:



```bash
pip install -e .[devel]
```

The `[devel]` above comes from the `requirements-devel.txt` file.  If you name your
development requirements differently (e.g. `requirements-dev.txt`), then you should
put that name in the square brackets (`[dev]` in that example).



## Utility implementations

### Context Manager to download a file from Azure blob



Implement _ds_dev_utils.azure_utils.blob.blob_file_ as a context manager.



Within the context manager, download the specified blob file to a local
[temporary directory](https://docs.python.org/3/library/tempfile.html#tempfile.TemporaryDirectory),
preserving the name and extension of the file.



```python
...
def blob_file(
    container_name: str,
    blob_name: str,
    sas_connection_str="",
    as_file=True,
):
    """Context manager to download a specified blob file from a specified container
    using a Shared-Access-Signature connection string.

    Function will download the blob file into a temporary directory and provide
    either the file handle or file path to the calling context for reading.  After
    execution the temporary file should be removed.

    :param str container_name: name of the blob container
    :param str blob_name: name of the blob file
    :param str sas_connection_str: the shared access signature
    :param bool as_file: If True the yielded object is a :class:`File`.  Otherwise it
        will be the file path as a string

    Example Usage:

    with blob_file( ..., as_file=False ) as blob_path:
        shutil.copyfile(blob_path, data_download_directory)

    """
    ...
```
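One possible shape for the implementation, as a minimal sketch assuming the `azure-storage-blob` SDK (error handling and retries omitted; your graded solution may differ):

```python
import os
import tempfile
from contextlib import contextmanager

from azure.storage.blob import BlobServiceClient


@contextmanager
def blob_file(container_name, blob_name, sas_connection_str="", as_file=True):
    client = BlobServiceClient.from_connection_string(sas_connection_str)
    blob = client.get_container_client(container_name).get_blob_client(blob_name)
    with tempfile.TemporaryDirectory() as tmp_dir:
        # Keep the blob's original file name and extension
        local_path = os.path.join(tmp_dir, os.path.basename(blob_name))
        with open(local_path, "wb") as fh:
            fh.write(blob.download_blob().readall())
        if as_file:
            with open(local_path, "rb") as fh:
                yield fh
        else:
            yield local_path
    # TemporaryDirectory removes the downloaded file on exit
```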



For this class, you can create a blob storage account for testing in a personal Azure
account, or you can test in an existing P&G development blob account if you already
have access to one.



Expenses within a personal Azure blob account should be insignificant (<$1) for the
purposes of this class.  If you have concerns, reach out to the instructor for help.



This [Azure blob quick-start tutorial](https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-python)
may be helpful in your implementation.



For permission management use the Azure portal to generate a [Shared Access Signature](https://adamtheautomator.com/azure-sas-token/#Generating_a_SAS_Token_using_the_Azure_Portal).



***Note:*** make sure you do not check your SAS token into Git.



Make sure to implement appropriate docstrings and unit tests, and that your function implementation is at the right path.



#### Testing



What should unit testing look like for a cloud-based API?  Should you hard-code credentials for testing (hint: no)?  



What are the parts of the implementation that you have added?  Can you separate
these out so that you can test your code without being reliant on the Azure SDK code?



Here is the `azure_utils.mock.FakeBlobServiceClient` implementation - can you use this with `@mock.patch` to enable improved unit testing?



```python
class FakeBlobServiceClient:
    """Provides a fake interface for azure.storage.blob.BlobServiceClient including
    implementation for:
    - BlobServiceClient.from_connection_string
    - BlobServiceClient.get_container_client
    - BlobContainerClient.get_blob_client
    - Blob.readall

    e.g. BlobServiceClient.from_connection_string(cnx_str). \
          get_container_client(cntr_name).get_blob_client(blob_name). \
          download_blob().readall()
    """

    file_dict = {}

    def __init__(self, file_dict=None):
        # Maps blob names to local file paths standing in for blob contents
        self.file_dict = file_dict if file_dict is not None else dict(self.file_dict)

    @classmethod
    def from_connection_string(cls, *args, **kwargs):
        # The connection string is ignored; contents come from file_dict
        return cls()

    def get_container_client(self, *args, **kwargs):
        return self

    def get_blob_client(self, blob, *args, **kwargs):
        f_location = self.file_dict.get(blob, None)
        f_dict = {"blob_service_selected_file": f_location}
        return FakeBlobServiceClient(f_dict)

    def download_blob(self, *args, **kwargs):
        return self

    def readall(self):
        f_location = self.file_dict.get("blob_service_selected_file")
        if f_location is None:
            raise FileNotFoundError("File not found")
        # Return raw bytes, matching the real SDK's readall()
        with open(f_location, "rb") as f:
            return f.read()
```



**Note:** Feel free to edit the FakeBlobServiceClient implementation if needed to
support your specific implementation.  A test sketch using `@mock.patch` follows.
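As one possibility, a unit test might patch the point where your blob module looks up `BlobServiceClient`.  The module path and blob name below are hypothetical, and this assumes your blob module imports `BlobServiceClient` at module level; adjust to your actual package layout:

```python
from unittest import mock

from ds_dev_utils.azure_utils.blob import blob_file
from ds_dev_utils.azure_utils.mock import FakeBlobServiceClient


def test_blob_file_reads_contents(tmp_path):
    # Stage a local file that the fake will serve as the blob's contents
    local = tmp_path / "hello.txt"
    local.write_text("hello blob")
    fake = FakeBlobServiceClient({"hello.txt": str(local)})

    # Patch the factory so no real Azure connection is attempted
    with mock.patch(
        "ds_dev_utils.azure_utils.blob.BlobServiceClient.from_connection_string",
        return_value=fake,
    ):
        with blob_file("some-container", "hello.txt", as_file=False) as path:
            with open(path) as fh:
                assert fh.read() == "hello blob"
```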



## Installing library in external application



### Install the utils



You can now install your `ds_dev_utils` as below.  Note that the `#egg` part is
important; it is not a comment!



```bash
pip install -e git+https://github.com/pg-ds-dev-class/ds-dev-utils-<GITHUBID>#egg=ds_dev_utils
```



This will include the latest main commit (hopefully tagged) and will be updated
whenever you re-run the install (or `pip install --upgrade`).  If you want to be
more specific about the version, you can use the `@v1.2.3` syntax when you install.
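For example, pinning to a hypothetical `v1.2.3` tag:

```bash
pip install -e "git+https://github.com/pg-ds-dev-class/ds-dev-utils-<GITHUBID>@v1.2.3#egg=ds_dev_utils"
```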



Leaving this to automatically check out the latest main is easiest, and a good reason to have
merge-only main releases (you don't want to accidentally commit a "break" into main and
have all library-using applications no longer work!)



You will need to make sure that you have your [GitHub SSH access](https://docs.github.com/en/free-pro-team@latest/github/authenticating-to-github/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent) set up to enable installation
directly from GitHub.



### Testing imported library in application



Once the library is installed, `pytest --pyargs ds_dev_utils` should run all the tests in your utils package.



You can run them by default within your application if you like, by adding `ds_dev_utils` to `testpaths`
in the [config file](https://docs.pytest.org/en/6.2.x/reference.html#confval-testpaths).



This is generally not needed, as we would expect the library to execute its own tests
as part of CI/CD deployment.  However, it never hurts to verify ;)



#### Updating



Every time you push a new main version of the library to GitHub, you can update the
installed version in your application by re-running the install (or `pip install --upgrade`).



## Recommended



This content is not necessary to complete the pset, but you might want to go through it for your own development.



<details>

<summary>Implementing Git-based versioning</summary>

We can check the library version in two ways:



```bash
python setup.py --version
python -c "import ds_dev_utils; print(ds_dev_utils.__version__)"
```



Both commands should print the same thing, which will look something like this:
`0.0.1.dev4+gdfedba7.d20190209`.
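That version format matches what [setuptools_scm](https://github.com/pypa/setuptools_scm) generates from git metadata.  A minimal sketch of the wiring, assuming that is the tool your template uses:

```python
# setup.py (sketch): derive the package version from git tags
from setuptools import setup

setup(
    use_scm_version=True,
    setup_requires=["setuptools_scm"],
)
```

```python
# ds_dev_utils/__init__.py (sketch): expose the installed version.
# importlib.metadata is stdlib on Python 3.8+; on 3.7 use the
# importlib_metadata backport instead.
from importlib.metadata import version

__version__ = version("ds_dev_utils")
```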



Once you have your `setup.py` and `__init__.py` configured so the above works,
***git tag*** your main `v0.1.0` (e.g., `git tag v0.1.0`).  Now verify:



```bash
python setup.py --version  # Should print a clean semantic version
```



From now on, all commits on the main branch [must have an accompanying semantic version tag](https://www.toolsqa.com/git/github-tags/).



Example of adding a tag (typically done on main branch):



```bash
git tag "v1.1.1"
git push --tags
```



When you later install this project into a problem set from a clean repo with a
tagged version, you'll get a nice version like `0.1.2`.  If, however, you inspect
the `__version__` in your package from your git repo, with non-committed edits,
etc., you'll get an informative 'dirty' version number like
`'0.2.1.dev0+g850a76d.d20180908'`.  This is useful for debugging, building sphinx
docs in dev, etc., and you never have to specify a version except via tagging
your commit.



</details>