pypigeonhole-build

- Name: pypigeonhole-build
- Version: 0.4.1
- Summary: Python build & packaging tool
- Home page: https://github.com/psilons/pypigeonhole-build
- Author: psilons
- Requires Python: >=3.6
- License: MIT
- Upload time: 2020-10-22 04:59:52
# Python Build Tools

![Python Package using Conda](https://github.com/psilons/pypigeonhole-build/workflows/Python%20Package%20using%20Conda/badge.svg)
![Test Coverage](https://raw.githubusercontent.com/psilons/pypigeonhole-build/master/coverage.svg)
[![PyPI version](https://badge.fury.io/py/pypigeonhole-build.svg)](https://badge.fury.io/py/pypigeonhole-build)
![Anaconda version](https://anaconda.org/psilons/pypigeonhole-build/badges/version.svg)
![Anaconda_platform](https://anaconda.org/psilons/pypigeonhole-build/badges/platforms.svg)
![License](https://anaconda.org/psilons/pypigeonhole-build/badges/license.svg)


This is a Python SDLC tool that shortens the time we spend on the software
development life cycle (SDLC) without sacrificing quality. It does so by
hard-coding certain flexible parts (convention over configuration).
Too much flexibility can lead to confusion and low efficiency: without a
standard, we cannot build tools on top of one to improve efficiency.

This tool is built on top of Conda, pip, and Git.

Specifically, we tackle the following areas:
- dependency management: create central structure and populate information to 
  setup.py and requirements.txt/environment.yml.
- version management: tag GIT code with the current version and then bump the 
  version (save back to GIT too).
- identify the key steps and capture them as scripts. These scripts'
  functionalities are important abstractions; the implementations can be
  altered if needed, e.g., our test script is based on unittest, but you may
  swap in a pytest version.

A good example of such efficiency is Java's mature build tool,
[Maven](http://maven.apache.org/).


## Goals

- Set up a standard project structure.
- Create reusable tools to minimize the necessary work for dependency
  management and CI.
- Make routine steps efficient.
- Make out-of-routine steps no more painful, i.e., our code should not add
  more hassle when you extend or modify it.


## Standard SDLC Process Acceleration

After the initial project setup, the process has the following steps, driven
by the script ```pphsdlc.sh``` (or ```pphsdlc.bat``` on Windows; we use the
bash name below for simplicity):
- setup: create conda environment specified in dep_setup.py
- test: run unit tests and collect coverage
- package: package artifact with pip | conda | zip
- upload: upload to pip | piptest | conda
- release: tag the current version in git and then bump the version
- cleanup: cleanup intermediate results in filesystem
- help (or no parameter): print this menu

These 6 steps (minus help) should be enough for most projects (excluding
integration testing, etc.), and they are simple steps - as simple as Maven's.
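The step dispatch above can be sketched in Python. This is an illustration
only: the real pphsdlc scripts are shell/batch, and the step descriptions
below simply restate the menu above.

```python
# Hypothetical sketch of the pphsdlc step dispatch. The real tool is a
# shell/batch script; this only illustrates the subcommand -> action mapping.
import sys

def run_step(step: str, *args: str) -> str:
    """Map a subcommand to its action; unknown steps fall back to help."""
    steps = {
        "setup": "create conda environment from dep_setup.py",
        "test": "run unit tests and collect coverage",
        "package": "package artifact with pip | conda | zip",
        "upload": "upload to pip | piptest | conda",
        "release": "tag the current version in git, then bump it",
        "cleanup": "clean up intermediate results in the filesystem",
    }
    if step not in steps:  # "help", or anything unrecognized, prints the menu
        return "usage: pphsdlc.sh " + " | ".join(steps)
    return steps[step]

if __name__ == "__main__":
    print(run_step(*(sys.argv[1:] or ["help"])))
```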


## Project Setup

Download Miniconda, if needed. Then install pypigeonhole-build into the base
environment:

```conda install -c psilons pypigeonhole-build```

This is the jump start of the process - it creates the other conda
environments specified in dep_setup.py. It installs its scripts into the base
env, prefixed by pph_. The interface is ```pphsdlc.sh``` with the above 6
options. This script should run from the project folder and in the conda env,
except for the first step (setup env).

Next, use your favorite IDE to create the Python project; these are one-time
setups:
- The project name is hyphen-separated, e.g., pypigeonhole-build.
- create src and test folders under the project root. These 2 names are
  hardcoded in the scripts.
  >We don't think naming freedom for these 2 folders helps us in any way.

  >We want to separate src and test completely, not one inside another.

- under src, create the top package folder; it's the project name with "-"
  replaced by "_". In this case, it's pypigeonhole_build. Since the top
  package has to be globally unique, choose it wisely. This top package name
  is also part of the conda env name.
- copy app_setup.py and dep_setup.py from here to the top package, and 
  modify them:
    - modify the version number in app_setup.py: __app_version. This 
      variable name is hardcoded in the version bumping script. You may 
      choose a different bumping strategy. 
    - modify the settings and add dependencies in the marked region in
      dep_setup.py. Each dependency has the following fields:
        - name: required. If name == python, the "python_requires" field in
          setup.py will be populated.
        - version: defaults to latest. Needs the full format: '==2.5'.
        - scope: defaults to DEV; can be DEV or INSTALL. INSTALL dependencies
          show up in the "install_requires" field; DEV dependencies show up
          in the "test_require" field.
        - installer: defaults to PIP; can be PIP or CONDA. Extendable to
          other installers.
        - url: this is for github+https dependencies.

      If Conda is used, you need to set the CONDA.env field, which maps to
      the first line of environment.yml. You may overwrite the default. If
      you have extra channels here, make sure you add them to the conda
      config:

      ```conda config --add channels new_channel```

      Otherwise, conda-build will fail later on.
- copy setup.py to the project root and change the imports around lines
  14/15 to match your top package.
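The dependency fields above can be sketched as plain Python records. This is
an illustration only: the `Dependency` class and its defaults below are
hypothetical stand-ins, not the actual pypigeonhole-build API.

```python
# Hypothetical sketch of dep_setup.py-style dependency records. The field
# names follow the description above; the real API may differ.
from dataclasses import dataclass

@dataclass
class Dependency:
    name: str              # required; name == "python" sets python_requires
    version: str = ""      # defaults to latest; full format such as "==2.5"
    scope: str = "DEV"     # DEV -> test_require, INSTALL -> install_requires
    installer: str = "PIP" # PIP or CONDA; extendable to other installers
    url: str = ""          # for github+https dependencies

deps = [
    Dependency(name="python", version=">=3.6", scope="INSTALL", installer="CONDA"),
    Dependency(name="coverage", scope="DEV"),
]

# Split the records into the two setup.py fields described above;
# "python" is special-cased into python_requires instead.
install_requires = [d.name + d.version for d in deps
                    if d.scope == "INSTALL" and d.name != "python"]
test_require = [d.name + d.version for d in deps if d.scope == "DEV"]
```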

This is the minimal information we need in order to carry out the SDLC
process:
  - the information is required, such as name and version.
  - it can be overwritten or extended.

## SDLC Process

- Now we set up the conda env: ```pphsdlc.sh setup```  
  At the end of the run, it prints out which env it created; just activate
  that. If you run into the following issue on Windows, just rerun the
  script (maybe the IDE locks the file, just a wild guess):  
  >ERROR conda.core.link:_execute(698): An error occurred while installing 
  package 'defaults::vs2015_runtime-14.16.27012-hf0eaf9b_3'. Rolling back 
  transaction: ...working... done   
  [Errno 13] Permission denied: 'D:\\0dev\\miniconda3\\envs\\py390_pypigeonhole_build\\vcruntime140.dll'    
  ()     

  >The existing conda environment with the same env name will be deleted,
  and a new environment will be created.

  >requirements.txt and environment.yaml are generated as well. Whenever
  we change dependencies, we need to rerun this script to regenerate the
  files. Conda uses environment.yml, without the "a" in yaml; GitHub
  Actions, however, uses environment.yaml, with the "a".

- Point the IDE to this conda env; we can start coding now.
- Once the code is done, we can run the unit tests: ```pphsdlc.sh test```.
  All scripts need to run from the project root and in the conda env. This
  step generates the test coverage report and coverage badge.

  >In order to run tests from the project root folder, we add a src reference
  in the __init__.py under the top test package. Otherwise, tests can run
  only from the src folder.

- If test coverage is good, we can package the project with pip, conda, or
  zip.
    - ```pphsdlc.sh package pip```: the target folder is printed at the end.
      It takes another parameter from .pypirc, such as testpypi. The default
      is pypi.
    - ```pphsdlc.sh package conda```: we need to create the conda package
      scripts first. The location bbin/pkg_conda_cfg is hardcoded in the
      script. There are 3 files under this folder that need to be filled in.
      Check the conda-build documentation for this:
      https://docs.conda.io/projects/conda-build/en/latest/.
      >There is a bug in conda-build for Windows: after the build, the conda
      envs are mislabeled, so close the window and open a new one. Conda
      build on Linux is just fine.

      >One of the reasons we like conda is that it can bundle other files.
      It's handier than zip because we use it as a transporter as well.

      The default upload server is the conda central server. To use another
      server, check the anaconda documentation. To use a local file system,
      set the env variable CONDA_UPLOAD_CHANNEL. If it is empty, files are
      uploaded to conda central. If it is set, e.g.,

      ```set CONDA_UPLOAD_CHANNEL=file:///D:\repo\channel```

      it copies the .tar.bz2 files there and runs indexing.
    - ```pphsdlc.sh package zip```: this zips 3 sub-folders under the project
      root: bin for scripts, conf for configurations, and dist for other
      compiled results, such as executables. Since this is a customized
      format, there is no associated upload tool; users need to handle
      uploading themselves. However, tar + zip are universal tools across
      different OSes.

  If C/C++ compilation is involved, we suggest saving the results in the
  dist folder for consistency. Then in bbin/pkg_conda_cfg/build.sh, we may
  bundle them. This applies to C/C++, PyInstaller, or Cython. We standardize
  the output folders for packagers: pip uses dist, conda uses dist_conda,
  and zip uses dist_zip.

- Now it's time to run some local sanity checks on the new packages. Install
  them locally.
- To upload packages to central servers, run
    - ```pphsdlc.sh upload pip```: this uploads to the PIP server. You may
      redirect the server in settings.
    - ```pphsdlc.sh upload conda <package>```: this uploads to the conda
      repo. You may redirect this too.

- Now we run ```pphsdlc.sh release``` to tag the version in Git and then bump
  the version. Neither pip nor conda has the concept of snapshot builds as in
  Maven, so we cannot overwrite versions. This step helps us manage versions.

  >We use the major.minor.patch version format. The minor and patch
  increments are bounded by 100 by default. This can be overwritten in
  app_setup.py.

  >Check in your changes before running this script.

- The cleanup step is optional; ```pphsdlc.sh cleanup``` deletes all build
  folders. Make sure you are not inside those folders.
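The bounded version bump in the release step can be sketched as follows.
This is an illustration under the default bound of 100; the actual
app_setup.py bumping logic may differ.

```python
# Hypothetical sketch of a bounded major.minor.patch bump: minor and patch
# roll over at the bound (100 by default), carrying into the next component.
def bump_version(version: str, bound: int = 100) -> str:
    major, minor, patch = (int(p) for p in version.split("."))
    patch += 1
    if patch >= bound:  # patch rolls over into minor
        patch, minor = 0, minor + 1
    if minor >= bound:  # minor rolls over into major
        minor, major = 0, major + 1
    return f"{major}.{minor}.{patch}"
```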

## Side Notes and Future Improvements

For install/deploy:
  - lib installation: use pip and/or conda.
  - app deployment: conda can bundle scripts and Python code, so we use conda
    as the transport to deploy apps to conda environments.
    >There are many other ways, on the ground or in the cloud, to deploy
    apps, such as Kubernetes, Ansible, etc. We leave these out due to the
    high degree of possible customization (i.e., no predictable patterns).

    >For applications, we hard-code the "bin" folder for start-up and other
    scripts, and the "conf" folder for configurations.

For any run on Windows, we use ```<script> 2>&1 | tee my.log``` to save the
log to a local file, since some commands clear the command window screen and
we would otherwise lose the screen output.

A sample project is in a separate repo:
[Project Template](https://github.com/psilons/pypigeonhole-proj-tmplt).
In fact, we set up this project itself in the same way described here.

If these tools are not suitable, just create other scripts local to the
project you work on. The existing scripts / Python code should not interfere
with such overrides or extensions.

Future considerations:
- package_data in setup.py is not supported (yet).
- dependency information is not populated to meta.yaml, which conda-build
  uses.
- We need network storage with HTTP access to store build/test results for
  CI.



            
