:Name: h5netcdf
:Version: 1.2.0
:Summary: netCDF4 via h5py
:Requires-Python: >=3.9
:License: BSD 3-Clause
:Upload date: 2023-06-02
h5netcdf
========

.. image:: https://github.com/h5netcdf/h5netcdf/workflows/CI/badge.svg
    :target: https://github.com/h5netcdf/h5netcdf/actions
.. image:: https://badge.fury.io/py/h5netcdf.svg
    :target: https://pypi.org/project/h5netcdf/
.. image:: https://github.com/h5netcdf/h5netcdf/actions/workflows/pages/pages-build-deployment/badge.svg?branch=gh-pages
    :target: https://h5netcdf.github.io/h5netcdf/

A Python interface for the `netCDF4`_ file-format that reads and writes local or
remote HDF5 files directly via `h5py`_ or `h5pyd`_, without relying on the Unidata
netCDF library.

.. _netCDF4: https://docs.unidata.ucar.edu/netcdf-c/current/file_format_specifications.html#netcdf_4_spec
.. _h5py: https://www.h5py.org/
.. _h5pyd: https://github.com/HDFGroup/h5pyd


.. why-h5netcdf

Why h5netcdf?
-------------

- It has one less binary dependency (netCDF C). If you already have h5py
  installed, reading netCDF4 with h5netcdf may be much easier than installing
  netCDF4-Python.
- We've seen occasional reports of better performance with h5py than
  netCDF4-python, though in many cases performance is identical. For
  `one workflow`_, h5netcdf was reported to be almost **4x faster** than
  `netCDF4-python`_.
- Anecdotally, HDF5 users seem to be unexcited about switching to netCDF --
  hopefully this will convince them that netCDF4 is actually quite sane!
- Finally, side-stepping the netCDF C library (and Cython bindings to it)
  gives us an easier way to identify the source of performance issues and
  bugs in the netCDF libraries/specification.

.. _one workflow: https://github.com/Unidata/netcdf4-python/issues/390#issuecomment-93864839
.. _xarray: https://github.com/pydata/xarray/

Install
-------

Ensure you have a recent version of h5py installed (I recommend using `conda`_ or
the community effort `conda-forge`_).
At least version 3.0 is required. Then::

    $ pip install h5netcdf

Or if you are already using conda::

    $ conda install h5netcdf

Note:

From version 1.2, h5netcdf tries to align with a `nep29`_-like support policy with regard
to its upstream dependencies.

.. _conda: https://conda.io/
.. _conda-forge: https://conda-forge.org/
.. _nep29: https://numpy.org/neps/nep-0029-deprecation_policy.html

Usage
-----

h5netcdf has two APIs, a new API and a legacy API. Both interfaces currently
reproduce most of the features of the netCDF interface, with the notable
exception of support for operations that rename or delete existing objects.
We simply haven't gotten around to implementing this yet. Patches
would be very welcome.

New API
~~~~~~~

The new API supports direct hierarchical access of variables and groups. Its
design is an adaptation of h5py to the netCDF data model. For example:

.. code-block:: python

    import h5netcdf
    import numpy as np

    with h5netcdf.File('mydata.nc', 'w') as f:
        # set dimensions with a dictionary
        f.dimensions = {'x': 5}
        # and update them with a dict-like interface
        # f.dimensions['x'] = 5
        # f.dimensions.update({'x': 5})

        v = f.create_variable('hello', ('x',), float)
        v[:] = np.ones(5)

        # you don't need to create groups first
        # you also don't need to create dimensions first if you supply data
        # with the new variable
        v = f.create_variable('/grouped/data', ('y',), data=np.arange(10))

        # access and modify attributes with a dict-like interface
        v.attrs['foo'] = 'bar'

        # you can access variables and groups directly using hierarchical
        # keys, like h5py
        print(f['/grouped/data'])

        # add an unlimited dimension
        f.dimensions['z'] = None
        # explicitly resize a dimension and all variables using it
        f.resize_dimension('z', 3)

Notes:

- Automatic resizing of unlimited dimensions with array indexing is not available.
- Dimensions need to be manually resized with ``Group.resize_dimension(dimension, size)``.
- Arrays are returned padded with the ``fillvalue`` (taken from the underlying HDF5
  dataset) up to the current size of the variable's dimensions. This behaviour is
  equivalent to netCDF4-python's ``Dataset.set_auto_mask(False)``.
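The explicit-resize and fill-padding behaviour described above can be seen end-to-end in a
short sketch (the file name, variable name, and fill value here are arbitrary choices for
illustration):

```python
import os
import tempfile

import h5netcdf
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "unlimited.nc")

with h5netcdf.File(path, "w") as f:
    f.dimensions["z"] = None                  # unlimited dimension
    v = f.create_variable("values", ("z",), float, fillvalue=-1.0)
    f.resize_dimension("z", 3)                # explicit resize; indexing alone won't grow it
    v[:2] = [1.0, 2.0]                        # write only the first two entries

with h5netcdf.File(path, "r") as f:
    data = f["values"][:]                     # third entry is padded with the fill value
```

Note that writing past the current size is not possible; the dimension must be resized
first, and every variable using it is resized along with it.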

Legacy API
~~~~~~~~~~

The legacy API is designed for compatibility with `netCDF4-python`_. To use it, import
``h5netcdf.legacyapi``:

.. _netCDF4-python: https://github.com/Unidata/netcdf4-python

.. code-block:: python

    import h5netcdf.legacyapi as netCDF4
    # everything here would also work with this instead:
    # import netCDF4
    import numpy as np

    with netCDF4.Dataset('mydata.nc', 'w') as ds:
        ds.createDimension('x', 5)
        v = ds.createVariable('hello', float, ('x',))
        v[:] = np.ones(5)

        g = ds.createGroup('grouped')
        g.createDimension('y', 10)
        g.createVariable('data', 'i8', ('y',))
        v = g['data']
        v[:] = np.arange(10)
        v.foo = 'bar'
        print(ds.groups['grouped'].variables['data'])

The legacy API is designed to be easy to try out for netCDF4-python users, but it is not an
exact match. Here is an incomplete list of functionality we don't include:

- Utility functions ``chartostring``, ``num2date``, etc., that are not directly necessary
  for writing netCDF files.
- h5netcdf variables do not support automatic masking or scaling (e.g., of values matching
  the ``_FillValue`` attribute). We prefer to leave this functionality to client libraries
  (e.g., `xarray`_), which can implement their exact desired scaling behavior. Nevertheless,
  arrays are returned padded with the ``fillvalue`` (taken from the underlying HDF5
  dataset) up to the current size of the variable's dimensions. This behaviour is
  equivalent to netCDF4-python's ``Dataset.set_auto_mask(False)``.
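Because h5netcdf leaves masking to the client, a netCDF4-python-style masked array can be
reproduced manually with NumPy. The fill value and data below are made up for illustration:

```python
import numpy as np

fill_value = -999.0
# values as h5netcdf would return them: no masking or scaling applied
raw = np.array([1.0, 2.0, -999.0, 4.0])

# reproduce netCDF4-python's default masking on the client side
masked = np.ma.masked_equal(raw, fill_value)
```

Operations on ``masked`` then skip the fill entries, e.g. ``masked.sum()`` ignores the
``-999.0`` element.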

.. _invalid netcdf:

Invalid netCDF files
~~~~~~~~~~~~~~~~~~~~

h5py implements some features that do not (yet) result in valid netCDF files:

- Data types:
    - Booleans
    - Complex values
    - Non-string variable length types
    - Enum types
    - Reference types
- Arbitrary filters:
    - Scale-offset filters

By default [#]_, h5netcdf will not allow writing files using any of these features,
as files with such features are not readable by other netCDF tools.

However, these are still valid HDF5 files. If you don't care about netCDF
compatibility, you can use these features by setting ``invalid_netcdf=True``
when creating a file:

.. code-block:: python

  # avoid the .nc extension for non-netcdf files
  f = h5netcdf.File('mydata.h5', invalid_netcdf=True)
  ...

  # works with the legacy API, too, though compression options are not exposed
  ds = h5netcdf.legacyapi.Dataset('mydata.h5', invalid_netcdf=True)
  ...

In such cases the ``_NCProperties`` attribute will not be saved to the file, or it will be
removed from an existing file. A warning is issued if the file has a ``.nc`` extension.
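For instance, complex values (one of the features listed above) can be written once
``invalid_netcdf=True`` is passed; the file and variable names here are arbitrary:

```python
import os
import tempfile

import h5netcdf
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "complex.h5")

# complex values are valid HDF5 but not valid netCDF, so opt in explicitly
with h5netcdf.File(path, "w", invalid_netcdf=True) as f:
    f.dimensions = {"x": 2}
    f.create_variable("signal", ("x",), data=np.array([1 + 2j, 3 - 4j]))

with h5netcdf.File(path, "r") as f:
    out = f["signal"][:]
```

Without ``invalid_netcdf=True``, the write above raises ``h5netcdf.CompatibilityError``.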

.. rubric:: Footnotes

.. [#] Otherwise, h5netcdf will raise ``h5netcdf.CompatibilityError``.

Decoding variable length strings
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

h5py 3.0 introduced `new behavior`_ for handling variable-length strings.
Instead of being automatically decoded with UTF-8 into NumPy arrays of ``str``,
they are returned as arrays of ``bytes``.

The legacy API preserves the old behavior of h5py (which matches netCDF4),
and automatically decodes strings.

The new API matches h5py behavior. Explicitly set ``decode_vlen_strings=True``
in the ``h5netcdf.File`` constructor to opt-in to automatic decoding.

.. _new behavior: https://docs.h5py.org/en/stable/strings.html

.. _phony dims:

Datasets with missing dimension scales
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

By default [#]_ h5netcdf raises a ``ValueError`` if variables with no dimension
scale associated with one of their axes are accessed.
You can set ``phony_dims='sort'`` when opening a file to let h5netcdf invent
phony dimensions according to `netCDF`_ behaviour.

.. code-block:: python

  # mimic netCDF-behaviour for non-netcdf files
  f = h5netcdf.File('mydata.h5', mode='r', phony_dims='sort')
  ...

Note that this iterates once over the whole group hierarchy, which affects
performance if you rely on lazy group access.
You can set ``phony_dims='access'`` instead to defer phony dimension creation
to group access time. The created phony dimension naming will differ from
`netCDF`_ behaviour.

.. code-block:: python

  f = h5netcdf.File('mydata.h5', mode='r', phony_dims='access')
  ...
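To see this in action, one can write a plain HDF5 file with h5py (no dimension scales at
all) and open it with ``phony_dims``; the file name is arbitrary:

```python
import os
import tempfile

import h5netcdf
import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "plain.h5")

# a plain HDF5 file written without any dimension scales
with h5py.File(path, "w") as f:
    f["data"] = np.arange(6).reshape(2, 3)

# h5netcdf invents phony dimensions instead of raising a ValueError
with h5netcdf.File(path, "r", phony_dims="sort") as f:
    dims = f["data"].dimensions
    shape = f["data"].shape
```

With the default ``phony_dims=None``, accessing ``f["data"]`` in the file above would
raise a ``ValueError``.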

.. rubric:: Footnotes

.. [#] Keyword default setting ``phony_dims=None`` for backwards compatibility.

.. _netCDF: https://docs.unidata.ucar.edu/netcdf-c/current/interoperability_hdf5.html

Track Order
~~~~~~~~~~~

As of h5netcdf 1.1.0, if h5py 3.7.0 or greater is detected, the ``track_order``
parameter is set to ``True`` enabling `order tracking`_ for newly created
netCDF4 files. This helps ensure that files created with the h5netcdf library
can be modified by the netCDF-C and netCDF4-python implementations used in
other software stacks. Since this change should be transparent to most users,
it was made without deprecation.

Since ``track_order`` is set at creation time, any dataset that was created with
``track_order=False`` (h5netcdf version 1.0.2 and older, except for 0.13.0) will
continue to be opened with order tracking disabled.

The following describes the behavior of h5netcdf with respect to order tracking
for a few key versions:

- In version 0.12.0 and earlier, the ``track_order`` parameter was missing
  and thus order tracking was implicitly set to ``False``.
- Version 0.13.0 enabled order tracking by setting the parameter
  ``track_order`` to ``True`` by default without deprecation.
- Versions 0.13.1 to 1.0.2 set ``track_order`` to ``False`` due to an
  `upstream bug`_ in h5py, a core dependency of h5netcdf, which was resolved in
  h5py 3.7.0 with the help of the h5netcdf team.
- In version 1.1.0, if h5py 3.7.0 or above is detected, the ``track_order``
  parameter is set to ``True`` by default.


.. _order tracking: https://docs.unidata.ucar.edu/netcdf-c/current/file_format_specifications.html#creation_order
.. _upstream bug: https://github.com/h5netcdf/h5netcdf/issues/136

.. changelog

Changelog
---------

`Changelog`_

.. _Changelog: https://github.com/h5netcdf/h5netcdf/blob/main/CHANGELOG.rst

.. license

License
-------

`3-clause BSD`_

.. _3-clause BSD: https://github.com/h5netcdf/h5netcdf/blob/main/LICENSE

            
