klp-commons


Name: klp-commons
Version: 0.0.69
Summary: Commons module of the Kloop ecosystem. Contains the shared modules used by the packages.
Author email: Omar Santa Cruz <omar.castellanos@kolpp.mx>
Home page: https://gitlab.com/klopp1/backend/microservices/commons
Upload time: 2023-07-04 18:26:05
Requirements: none recorded.
## Commons

This code repository implements the `Commons` microservice of the Klopp infrastructure.

A description of the most important files and directories is provided below:

## Template

- `setup.py`
- `Notebook`
- `test`
- `requirements.txt`
    - Libraries needed to reproduce the environment

## Project structure

```
├── LICENSE
├── Makefile           <- Makefile with commands like `make data` or `make train`
├── README.md          <- The top-level README for developers using this project.
├── docs               <- A default Sphinx project; see sphinx-doc.org for details
├── models             <- Trained and serialized models, model predictions, or model summaries
├── experiments 
│   ├── notebooks      <- Jupyter notebooks. Naming convention is a number (for ordering).
│   │    └── mlflow    <- Metrics and model management
│   ├── references     <- Data dictionaries, manuals, and all other explanatory materials.
│   ├── processed      <- The final, canonical data sets for modeling. 
│   └── data  
│     ├── external       <- Data from third party sources.
│     ├── interim        <- Intermediate data that has been transformed.
│     ├── processed      <- The final, canonical data sets for modeling.
│     └── raw            <- The original, immutable data dump.
├── requirements.txt   <- The requirements file for reproducing the analysis environment, e.g.
│                         generated with `pip freeze > requirements.txt`
├── setup.py           <- Makes the project installable with pip (`pip install -e .`)
├── pipeline           <- Source pipeline for loading, preprocessing, training and testing
│   ├── __init__.py    <- Makes pipeline a Python module
│   ├── data           <- Scripts to download or generate data
│   │   └── make_dataset.py
│   ├── features       <- Scripts to turn raw data into features for modeling
│   │   └── build_features.py
│   ├── models         <- Scripts to train models and then use trained models to make
│   │   │                 predictions
│   │   ├── predict_model.py
│   │   └── train_model.py
│   └── visualization  <- Scripts to create exploratory and results oriented visualizations
│       └── visualize.py
├── categorization     <- Source code for use in this project.
│   ├── __init__.py    <- Makes categorization a Python module
│   ├── categorization.py <- Class with the run() method that starts the app
│   ├── classifier.py   <- Class for the ML model
│   ├── consumer.py  <- Kafka consumer class
│   ├── controller_dynamo_db.py <- CRUD management class for DynamoDB
│   ├── controller_ml_fow.py   <- Model management class (MLflow)
│   ├── controller_posgrest_db.py  <- CRUD management class for PostgreSQL
│   ├── producer.py <- Kafka producer class
│   ├── nicknames.py   <- Nickname handling class
│   ├── merchantnames.py  <- Merchant name handling class
│   └── logs       <- Folder for log files
└── tox.ini            <- tox file with settings for running tox (automate and standardize testing)
```


## Reproducing the project

## Required software

The project was developed with the following top-level requirements:

Python 3.10.4

For development, it is recommended to use a virtual environment managed by conda.

`conda create -n categorization python=3.10.4` 

Use only pip as the package manager after creating the virtual environment with conda.
The required libraries can be passed to pip via the `requirements.txt` file:

`pip install -r requirements.txt`

See the [requirements.txt documentation](https://requirements-txt.readthedocs.io/en/latest/#:~:text=txt%20installing%20them%20using%20pip.&text=The%20installation%20process%20include%20only,That's%20it.&text=Customize%20it%20the%20way%20you,allow%20or%20disallow%20automated%20requirements) for details.
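
Putting the steps above together, a minimal environment setup looks like this (the environment name comes from the conda command above):

```
# Create and activate the conda environment with the pinned Python version
conda create -n categorization python=3.10.4
conda activate categorization

# Use only pip inside the environment to install the dependencies
pip install -r requirements.txt
```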


Another option is to use an official Python Docker image, version 3.10 at minimum. This applies only if you use Linux or Windows as your operating system; there are compatibility problems on M1 MacBooks.


[Docker Hub de Python](https://hub.docker.com/_/python)
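
As a sketch, assuming the project is checked out in the current directory, the official image could be used like this (the mount path and commands are illustrative, not taken from this repository):

```
# Start a disposable container from the official Python 3.10 image,
# mounting the project into /app (Linux/Windows hosts only, per the note above)
docker run --rm -it -v "$PWD":/app -w /app python:3.10 bash

# Inside the container: install dependencies and run the test suite
pip install -r requirements.txt
python -m unittest discover -s test
```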

- For the local environment, Jupyter Notebook is used as the experimentation environment
- MLflow with PostgreSQL is used to manage the ML models (see the server sketch after this list)
- PostgreSQL is used as the relational database manager
- DynamoDB is used to store unstructured information
- DVC is used for dataset versioning (see the workflow sketch after this list)
- The Black, Flake8, and autopep8 packages are used for autoformatting (see the commands after this list)
- Unit tests use the standard Python `unittest` package (see the command after this list)
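
For the MLflow/PostgreSQL combination, a minimal sketch of a tracking server is shown below; the connection string, database name, and artifact root are illustrative, not taken from this repository:

```
# Launch an MLflow tracking server backed by PostgreSQL;
# replace user, password, host and database with your own values
mlflow server \
    --backend-store-uri postgresql://user:password@localhost:5432/mlflow \
    --default-artifact-root ./mlruns \
    --host 0.0.0.0 --port 5000
```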
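For dataset versioning with DVC, a typical workflow over the `experiments/data` tree might look like this (the remote URL is a placeholder):

```
# Track the raw data directory with DVC and commit the pointer file to git
dvc init
dvc add experiments/data/raw
git add experiments/data/raw.dvc experiments/data/.gitignore

# Configure a remote (placeholder URL) and push the data
dvc remote add -d storage s3://example-bucket/dvc-store
dvc push
```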
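Finally, the formatting and test tools can be run from the project root; the `test` directory name comes from the Template section above:

```
# Autoformatting and linting
black .
flake8 .
autopep8 --in-place --recursive .

# Unit tests with the standard library runner
python -m unittest discover -s test -v
```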

            
