datacube
========

Name: datacube
Version: 1.8.19
Home page: https://github.com/opendatacube/datacube-core
Summary: An analysis environment for satellite and other earth observation data
Upload time: 2024-07-02 07:19:16
Maintainer: Open Data Cube
Author: Open Data Cube
Requires Python: >=3.9.0
License: Apache License 2.0
Open Data Cube Core
===================

.. image:: https://github.com/opendatacube/datacube-core/workflows/build/badge.svg
    :alt: Build Status
    :target: https://github.com/opendatacube/datacube-core/actions

.. image:: https://codecov.io/gh/opendatacube/datacube-core/branch/develop/graph/badge.svg
    :alt: Coverage Status
    :target: https://codecov.io/gh/opendatacube/datacube-core

.. image:: https://readthedocs.org/projects/datacube-core/badge/?version=latest
    :alt: Documentation Status
    :target: http://datacube-core.readthedocs.org/en/latest/

Overview
========

The Open Data Cube Core provides an integrated gridded data analysis
environment for decades of analysis-ready Earth observation satellite data and
related data from multiple satellite and other acquisition systems.
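
A minimal sketch of typical use, assuming a configured datacube index and an
already-indexed product (the product name and query extents below are
hypothetical):

::

   import datacube

   dc = datacube.Datacube(app="overview-example")
   # Load an analysis-ready cube as an xarray.Dataset for a small
   # spatio-temporal query; adjust to match what your index contains.
   ds = dc.load(
       product="ls8_example_product",
       x=(149.0, 149.2),
       y=(-35.4, -35.2),
       time=("2020-01-01", "2020-02-01"),
   )
   print(ds)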

Documentation
=============

See the `user guide <http://datacube-core.readthedocs.io/en/latest/>`__ for
installation and usage of the datacube, and for documentation of the API.

`Join our Slack <http://slack.opendatacube.org>`__ if you need help
setting up or using the Open Data Cube.

Please help us to keep the Open Data Cube community open and inclusive by
reading and following our `Code of Conduct <code-of-conduct.md>`__.

Requirements
============

System
~~~~~~

-  PostgreSQL 10+
-  Python 3.9+

Developer setup
===============

1. Clone:

   -  ``git clone https://github.com/opendatacube/datacube-core.git``

2. Create a Python environment for using the ODC.  We recommend `Mambaforge <https://mamba.readthedocs.io/en/latest/user_guide/mamba.html>`__ as the
   easiest way to handle Python dependencies.

::

   mamba env create -f conda-environment.yml
   conda activate cubeenv

3. Install a development (editable) version of datacube-core.

::

   cd datacube-core
   pip install --upgrade -e .

4. Install the `pre-commit <https://pre-commit.com>`__ hooks to help follow ODC coding
   conventions when committing with git.

::

   pre-commit install

5. Run unit tests + PyLint

Install test dependencies using:

   ``pip install --upgrade -e '.[test]'``

If installation of the test dependencies fails, please lodge an issue.

Run unit tests with:

   ``./check-code.sh``

   (this script approximates what is run by GitHub Actions. You can
   alternatively run ``pytest`` yourself).

6. **(or)** Run all tests, including integration tests.

   ``./check-code.sh integration_tests``

   -  Assumes a password-less Postgres database running on localhost called
      ``pgintegration``.

   -  Otherwise copy ``integration_tests/integration.conf`` to
      ``~/.datacube_integration.conf`` and edit to customise (a sketch of the
      relevant settings follows this list).

   -  For instructions on setting up a password-less Postgres database, see
      the `developer setup instructions <https://datacube-core.readthedocs.io/en/latest/installation/setup/ubuntu.html#postgres-database-configuration>`__.


Alternatively, one can use the ``opendatacube/datacube-tests`` docker image to
run tests. This image includes a database server pre-configured for running
integration tests. Add the ``--with-docker`` command line option as the first
argument to the ``./check-code.sh`` script.

::

   ./check-code.sh --with-docker integration_tests


To run individual tests in a Docker container:

::

    docker build --tag=opendatacube/datacube-tests-local --no-cache --progress plain -f docker/Dockerfile .

    docker run -ti -v $(pwd):/code opendatacube/datacube-tests-local:latest pytest integration_tests/test_filename.py::test_function_name


Developer setup on Ubuntu
~~~~~~~~~~~~~~~~~~~~~~~~~

These instructions build a Python virtual environment on Ubuntu suitable for development work.

Install dependencies:

::

    sudo apt-get update
    sudo apt-get install -y \
        autoconf automake build-essential make cmake \
        graphviz \
        python3-venv \
        python3-dev \
        libpq-dev \
        libyaml-dev \
        libnetcdf-dev \
        libudunits2-dev


Build the Python virtual environment:

::

    pyenv="${HOME}/.envs/odc"  # Change to suit your needs
    mkdir -p "${pyenv}"
    python3 -m venv "${pyenv}"
    source "${pyenv}/bin/activate"
    pip install -U pip wheel cython numpy
    pip install -e '.[dev]'
    pip install flake8 mypy pylint autoflake black
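
As a quick sanity check of the new environment (a minimal sketch; the printed
version depends on your checkout), confirm that the editable install is
visible to Python:

::

    python3 -c "import datacube; print(datacube.__version__)"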
