data-copilot 0.1.2 (PyPI)
=========================

:Summary: The data_copilot web application
:Requires: Python >=3.10
:Keywords: data_copilot
:Uploaded: 2023-06-13
.. image:: https://raw.githubusercontent.com/Modulos/data_copilot/main/data_copilot/frontend/src/assets/icons/svg/logo-landing.svg
   :target: #
   :align: center
   :width: 100%

--------


.. image:: https://pyup.io/repos/github/Modulos/data_copilot/shield.svg
     :target: https://pyup.io/repos/github/Modulos/data_copilot/
     :alt: Updates

.. image:: https://img.shields.io/badge/python-3.10-blue
     :target: #
     :alt: Python Version 3.10

.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
     :target: https://github.com/psf/black

.. image:: https://github.com/Modulos/data_copilot/actions/workflows/pr-test.yml/badge.svg?event=push
     :target: https://github.com/Modulos/data_copilot/actions/workflows/pr-test.yml

.. image:: https://img.shields.io/discord/1110586557963972618?color=%237289DA&label=DataCopilot&logo=discord&logoColor=white
   :target: https://discord.gg/muerW29z

--------

🚀 Welcome to Data Copilot!
===========================

Data Copilot is not just another data analysis tool. It is an end-to-end, scalable, Docker-based solution engineered to change the way you engage with data. As a comprehensive platform, it combines frontend, backend, and execution functionality into a seamless user experience. Whether you're dealing with CSV or XLSX files, simply upload your data and start asking questions. With Data Copilot, you are not just analyzing data, you're conversing with it. It is more than a tool: it's your co-pilot on the journey to unlock meaningful insights from complex data. It is also a framework that lets you build your own prompt-based applications, adding an extra dimension to user interaction. And with exciting updates on the horizon, the possibilities are limitless.

Here's what makes Data Copilot your go-to data analysis companion:

- 📊 Streamlined Data Analysis: Designed to streamline data analysis, making it more efficient and accessible.
- 🚢 Docker-Based: Leverage the power of containerization to ensure scalability and easy deployment.
- 📑 Multi-Format Support: From CSV to XLSX, upload files in various formats and interactively analyze them, easily extending to other formats.
- 💬 Interactive Querying: Transform complex data into understandable insights through interactive queries.
- 🛠️ Customizable Framework: A robust platform that lets you build your own prompt-based applications for enhanced user experience.
- 📈 Future-Proof: Stay tuned for future updates that promise to further enhance its versatility and utility in data management.

🔑 Prerequisites
================

Before you can install Data Copilot, you must have an OpenAI API key. You can get one by signing up for an account at `openai.com <https://beta.openai.com/signup>`_. Once you have an API key, you can proceed with the installation.


🐳 Installation (with Docker)
=============================

Before you can install Data Copilot, you need to make sure you have the following tools installed:

- `Docker <https://docs.docker.com/get-docker/>`_
- `Docker Compose <https://docs.docker.com/compose/install/>`_
- `Python3 <https://www.python.org/downloads/>`_

Each of these tools has its own installation guide. Follow the links to get instructions for your specific operating system (Windows, Mac, or Linux).

**Cloning and Setting Up**

Once you have Docker, Docker Compose, and Python3 installed, you can download and set up Data Copilot. Run the following commands in your terminal:

.. code-block:: bash

    git clone https://github.com/modulos/data_copilot.git
    cd data_copilot
    pip install ".[dev]"
    make setup

**Open Data Copilot in your browser: http://localhost:80**


These commands will clone the Data Copilot repository and run the setup process.

During the setup process, you will be prompted to enter your OpenAI API key. You can also set it manually after installation by editing the ``.dev.env`` file in the root directory of the repository.

Choose `sql` or `langchain` as the compute backend to use the full functionality of Data Copilot. The `getting_started` compute backend is a limited version that helps you get started with implementing your own logic.
Check out the `Build your own Copilot` section for more information.
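If you prefer to add the key by hand, a minimal sketch of appending it to the ``.dev.env`` file (the variable name `OPENAI_API_KEY` is an assumption; check the generated file for the exact key name):

.. code-block:: bash

    # Assumed variable name; verify against the generated .dev.env file.
    echo "OPENAI_API_KEY=sk-your-key-here" >> .dev.env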



.. image:: https://raw.githubusercontent.com/Modulos/data_copilot/main/assets/login_page.png
   :align: center
   :width: 100%


🐍 Install from PyPI
====================

In the current implementation, you first need to install Redis.


For Linux (with apt):


.. code-block:: bash

  sudo apt install redis


For macOS (with Homebrew):


.. code-block:: bash

  brew install redis


Next, make sure you have Python 3.10 installed. Then run the following commands in your terminal:


.. code-block:: bash

  mkdir data_copilot
  cd data_copilot
  python3.10 -m venv venv
  source venv/bin/activate
  pip install data-copilot
  data-copilot run

**If you run data-copilot like this, you can open Data Copilot in your browser on port 8080: http://localhost:8080**


Maintaining and Updating
------------------------

You can run Data Copilot in the Docker setup with either `make run` or `make run-dev`. In the dev setup, hot reloading is activated for your code.

To reset the database, run `make reset-db` in the root directory of the repository. This drops all tables and recreates them.


πŸ›οΈ Architecture
===============

.. image:: https://raw.githubusercontent.com/Modulos/data_copilot/main/assets/architecture.svg
   :align: center
   :width: 100%

The Data Copilot system is composed of several services, each running in its own Docker container. These services interact to provide a comprehensive data processing and management solution. The number in brackets indicates the exposed port for each service. The number after the colon indicates the internal port used by the service.

- **Nginx:** This service acts as a reverse proxy for the backend and adminer services. It uses the `data-copilot-nginx` Docker image and listens on port 80.

- **Database:** This service runs a PostgreSQL database server, using the `postgres:latest` Docker image. The database data is stored in a Docker volume for persistence.

- **Frontend:** The user interface of the application is provided by the frontend service, using the `data-copilot-frontend` Docker image. The frontend framework is Vue3.

- **Backend:** The main application logic is handled by the backend service. It uses the `data-copilot-backend` Docker image and interacts with the database. The backend framework is `FastAPI <https://github.com/tiangolo/fastapi>`_.

- **Adminer:** This service provides a web interface for managing the PostgreSQL database. It uses the `adminer` Docker image.

- **Redis Queue:** This service manages a job queue for asynchronous tasks. It uses the `redis:alpine` Docker image.

- **Celery Worker:** This service executes the asynchronous tasks from the Redis queue. It uses the `data-copilot-celery-worker` Docker image.

- **Flower:** This service provides a web interface for monitoring the Celery worker tasks. It uses the `data-copilot-celery-flower` Docker image.

The services are interconnected, with data flowing between them as necessary. This architecture allows for scalability, as each component can be scaled independently as per the workload.
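For example, with Docker Compose the worker tier can be scaled on its own. A sketch, assuming the Compose service is named `celery-worker` (check `docker-compose.yml` for the actual service name):

.. code-block:: bash

    # Run three Celery worker replicas; the service name is an assumption.
    docker compose up -d --scale celery-worker=3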


🔧 Development
==============

Storage
-------

By default, Data Copilot uses local storage for data persistence. The data folder is named `shared-fs` and is created in your current working directory. This setup should be sufficient for most development tasks.

However, for more extensive data handling, Data Copilot supports Azure as a storage backend. This allows you to utilize Azure's scalable and secure storage solutions for your data.

If you choose to use Azure as your storage backend, you will need to set the following environment variables in the `.dev.env` file:

- `AZURE_STORAGE_ACCOUNT_KEY`: Your Azure storage account key.
- `AZURE_STORAGE_ACCOUNT_NAME`: Your Azure storage account name.
- `STORAGE_BACKEND`: The URL of your Azure storage container. The URL should be in the following format: `https://{storage_account}.dfs.core.windows.net/{container}/`.

These environment variables configure the connection to your Azure storage account and specify the storage container to use.

Remember to replace `{storage_account}` with your Azure storage account name and `{container}` with the name of your Azure storage container.
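As a sketch, the corresponding entries in `.dev.env` could look like the following (all values are placeholders):

.. code-block:: bash

    AZURE_STORAGE_ACCOUNT_KEY=<your-account-key>
    AZURE_STORAGE_ACCOUNT_NAME=<your-account-name>
    STORAGE_BACKEND=https://<your-account-name>.dfs.core.windows.net/<container>/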


Database
--------

Data Copilot uses PostgreSQL as its database. This provides a robust and scalable solution for data management. 

The default environment variables for connecting to the PostgreSQL database are:

- `DB_CONNECTION_STRING`: The connection string for the PostgreSQL database. The default value is `postgresql://postgres:postgres@database:5432/postgres`.

For the PyPI version of Data Copilot, the default value is `sqlite:///data_copilot.db`.
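Since the connection string comes from the environment, a minimal sketch of how a client script might resolve it (the fallback mirrors the SQLite default above; Data Copilot's own lookup may differ):

.. code-block:: python

    import os

    # Use DB_CONNECTION_STRING when set; otherwise fall back to the SQLite
    # default used by the PyPI build.
    db_url = os.environ.get("DB_CONNECTION_STRING", "sqlite:///data_copilot.db")
    print(db_url)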


Development and Hot Reloading
-----------------------------

Data Copilot supports hot reloading, which allows you to see the effects of your code changes in real time without needing to manually stop and restart the application. This feature significantly speeds up the development process and provides instant feedback, making it easier to build and iterate on your application.

To start the service with hot reloading enabled, run the following command:

.. code-block:: bash

    make run-dev

This command will start the Data Copilot service in development mode. Now, whenever you make changes to your code, those changes will be immediately reflected in the running application.


🚀 Build your own Copilot
=========================


Data Copilot is not just a standalone application, but also a framework that you can use to build your own data processing and analysis tools. Here are the steps to get started:

1. **Logic:** All the logic of your app should live in the `data_copilot/execution_apps/apps` directory. You can modify the logic here to suit your specific needs. Inherit from the `data_copilot.execution_apps.base.DataCopilotApp` class and implement at least the following three static methods:

   - `supported_file_types`: Returns a dict of the supported file types. Each key is an identifier and each value is the content type of that file type.
   - `process_data_upload`: Receives a list of FastAPI `UploadFile` objects and returns a list of dicts, where each key is a file name and each value is the file content as a `BufferedIOBase`.
   - `execute_message`: Contains the execution logic of your app, which is executed on the worker.

2. **Message:** The `execute_message` method should return a `data_copilot.execution_apps.helpers.Message` object.

With these steps, you can customize Data Copilot to handle your specific data processing and analysis tasks. Remember to thoroughly test your changes to ensure they work as expected.
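The steps above can be sketched as a minimal app skeleton. This is illustrative only: the method names come from this section, but the bodies are placeholders, and in the real project you would inherit from `data_copilot.execution_apps.base.DataCopilotApp` and return a proper `Message` object from `execute_message`:

.. code-block:: python

    # Illustrative stand-in: inherit from
    # data_copilot.execution_apps.base.DataCopilotApp in the real project.
    class CsvEchoApp:
        @staticmethod
        def supported_file_types():
            # identifier -> content type of each supported file type
            return {"csv": "text/csv"}

        @staticmethod
        def process_data_upload(files):
            # `files` would be FastAPI UploadFile objects; here we assume
            # each item exposes .filename and .file (a binary stream).
            return [{f.filename: f.file} for f in files]

        @staticmethod
        def execute_message(message):
            # Runs on the Celery worker; the real method should return a
            # data_copilot.execution_apps.helpers.Message object.
            return f"echo: {message}"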


Build Python Package
--------------------

To build the Python package, first install the frontend's npm dependencies:

.. code-block:: bash

    cd data_copilot/frontend
    npm install
    cd ../../


Then run the following command to build the Python package:

.. code-block:: bash

    make dist

Data Copilot Trademark
======================
Data Copilot is a trademark of Modulos AG. 


Current Maintainers
===================
- `Tim Rohner <https://github.com/leokster>`_
- `Dennis Turp <https://github.com/mdturp>`_

Contributors
============

.. list-table::
   :header-rows: 1

   * - Project Leads
     - Backend
     - DevOps
     - Frontend
     - Design
   * - `Dennis Turp <https://github.com/mdturp>`_
     - `Tim Rohner <https://github.com/leokster>`_
     - `Jiri Kralik <https://github.com/jirikralik>`_
     - `Dennis Turp <https://github.com/mdturp>`_
     - `Celina Jong <https://github.com/celinajong>`_
   * - `Tim Rohner <https://github.com/leokster>`_
     - `Dennis Turp <https://github.com/mdturp>`_
     - `Serhii  Kyslyi <https://github.com/serhiikyslyi>`_
     - `Oleh Lukashchuk <https://github.com/Olehlukashchuk96>`_
     - 
   * - 
     - `Michael Röthlisberger <https://github.com/roethlisbergermichael>`_
     - `Keven Le Moing <https://github.com/KevenLeMoing>`_
     - 
     - 
   * -
     - `Keven Le Moing <https://github.com/KevenLeMoing>`_
     -  
     -  
     -  
   * -
     - `Severin Husmann <https://github.com/serced>`_
     -
     -
     -
   * - 
     - `Andrei Vaduva <https://github.com/andreiv-dev>`_
     - 
     - 
     - 
   * - 
     - `Dominic Stark <https://github.com/dominicstark>`_
     - 
     - 
     - 
   * - 
     - `Tomasz Kucharski <https://github.com/tomkuch>`_
     - 
     - 
     - 



            
