datacube
========

:Name: datacube
:Version: 1.8.18
:Home page: https://github.com/opendatacube/datacube-core
:Summary: An analysis environment for satellite and other earth observation data
:Upload time: 2024-03-27 01:06:24
:Maintainer: Open Data Cube
:Author: Open Data Cube
:Requires Python: >=3.9.0
:License: Apache License 2.0

Open Data Cube Core
===================

.. image:: https://github.com/opendatacube/datacube-core/workflows/build/badge.svg
    :alt: Build Status
    :target: https://github.com/opendatacube/datacube-core/actions

.. image:: https://codecov.io/gh/opendatacube/datacube-core/branch/develop/graph/badge.svg
    :alt: Coverage Status
    :target: https://codecov.io/gh/opendatacube/datacube-core

.. image:: https://readthedocs.org/projects/datacube-core/badge/?version=latest
    :alt: Documentation Status
    :target: http://datacube-core.readthedocs.org/en/latest/

Overview
========

The Open Data Cube Core provides an integrated gridded data analysis
environment for decades of analysis-ready Earth observation data from
multiple satellites and other acquisition systems.
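
The snippet below is a minimal, illustrative sketch of the Python API once
some data has been indexed; the product name, spatial extent, and time range
are placeholders that depend on what is available in your own datacube.

::

    import datacube

    # Connect using the default configuration (e.g. ~/.datacube.conf).
    dc = datacube.Datacube(app="readme-example")

    # Load a hypothetical product over a small area; returns an xarray.Dataset.
    # "ls8_example_product" and the extents below are placeholders only.
    data = dc.load(
        product="ls8_example_product",
        x=(149.0, 149.1),
        y=(-35.3, -35.2),
        time=("2020-01-01", "2020-02-01"),
    )
    print(data)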

Documentation
=============

See the `user guide <http://datacube-core.readthedocs.io/en/latest/>`__ for
installation and usage of the datacube, and for documentation of the API.

`Join our Slack <http://slack.opendatacube.org>`__ if you need help
setting up or using the Open Data Cube.

Please help us to keep the Open Data Cube community open and inclusive by
reading and following our `Code of Conduct <code-of-conduct.md>`__.

Requirements
============

System
~~~~~~

-  PostgreSQL 10+
-  Python 3.9+

Developer setup
===============

1. Clone:

   -  ``git clone https://github.com/opendatacube/datacube-core.git``

2. Create a Python environment for using the ODC.  We recommend `Mambaforge <https://mamba.readthedocs.io/en/latest/user_guide/mamba.html>`__ as the
   easiest way to handle Python dependencies.

::

   mamba env create -f conda-environment.yml
   conda activate cubeenv

3. Install a develop version of datacube-core.

::

   cd datacube-core
   pip install --upgrade -e .
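
A quick, optional sanity check (not part of the official steps) is to import
the freshly installed package and print its version:

::

    import datacube

    # An editable ("develop") install is importable straight away and reports
    # the version from the cloned source tree.
    print(datacube.__version__)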

4. Install the `pre-commit <https://pre-commit.com>`__ hooks to help follow ODC coding
   conventions when committing with git.

::

   pre-commit install

5. Run unit tests + PyLint

Install test dependencies using:

   ``pip install --upgrade -e '.[test]'``

If installing the test dependencies fails, please lodge the failures as issues.

Run unit tests with:

   ``./check-code.sh``

   (this script approximates what is run by GitHub Actions. You can
   alternatively run ``pytest`` yourself).

6. **(or)** Run all tests, including integration tests.

   ``./check-code.sh integration_tests``

   -  Assumes a password-less Postgres database named ``pgintegration``
      running on localhost.

   -  Otherwise copy ``integration_tests/integration.conf`` to
      ``~/.datacube_integration.conf`` and edit it to customise the
      connection settings (see the example after this list).

   -  For instructions on setting up a password-less Postgres database, see
      the `developer setup instructions <https://datacube-core.readthedocs.io/en/latest/installation/setup/ubuntu.html#postgres-database-configuration>`__.
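
If you point ad-hoc scripts at the customised configuration, the same file
can be passed explicitly when constructing a ``Datacube``. This is an
illustrative sketch only; the ``config`` argument is part of the public API,
but the application name below is arbitrary.

::

    import os
    import datacube

    # Use the integration configuration instead of the default ~/.datacube.conf.
    conf_path = os.path.expanduser("~/.datacube_integration.conf")
    dc = datacube.Datacube(app="integration-check", config=conf_path)
    print(dc.list_products())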


Alternatively, you can use the ``opendatacube/datacube-tests`` Docker image to
run the tests. This image includes a database server pre-configured for running
integration tests. Add the ``--with-docker`` command-line option as the first
argument to the ``./check-code.sh`` script.

::

   ./check-code.sh --with-docker integration_tests


To run individual tests in a Docker container:

::

    docker build --tag=opendatacube/datacube-tests-local --no-cache --progress plain -f docker/Dockerfile .

    docker run -ti -v $(pwd):/code opendatacube/datacube-tests-local:latest pytest integration_tests/test_filename.py::test_function_name


Developer setup on Ubuntu
~~~~~~~~~~~~~~~~~~~~~~~~~

The following steps build a Python virtual environment on Ubuntu suitable for
development work.

Install dependencies:

::

    sudo apt-get update
    sudo apt-get install -y \
        autoconf automake build-essential make cmake \
        graphviz \
        python3-venv \
        python3-dev \
        libpq-dev \
        libyaml-dev \
        libnetcdf-dev \
        libudunits2-dev


Build the Python virtual environment:

::

    pyenv="${HOME}/.envs/odc"  # Change to suit your needs
    mkdir -p "${pyenv}"
    python3 -m venv "${pyenv}"
    source "${pyenv}/bin/activate"
    pip install -U pip wheel cython numpy
    pip install -e '.[dev]'
    pip install flake8 mypy pylint autoflake black
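
After the environment is built, an optional import smoke test (illustrative
only) confirms that the packages relying on the system libraries installed
above compiled and linked correctly:

::

    # These imports exercise the system libraries installed via apt above.
    import netCDF4   # wraps libnetcdf
    import psycopg2  # wraps libpq
    import yaml      # PyYAML, optionally accelerated by libyaml
    print("native dependencies OK")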



            
