| Field | Value |
|---|---|
| Name | dh5io |
| Version | 0.2.1 |
| home_page | None |
| Summary | Python tools for DAQ-HDF5 (dh5) file format used at Brain Research Institute of University of Bremen |
| upload_time | 2025-08-21 10:03:18 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.11 |
| license | None |
| keywords | data format, electrophysiology, hdf5, daq |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |

# Python Tools for the DAQ-HDF5 format
A Python package for handling
[DAQ-HDF5](https://github.com/cog-neurophys-lab/DAQ-HDF5) (`*.dh5`) files. The DH5 format is
a hierarchical data format based on [HDF5](https://www.hdfgroup.org/solutions/hdf5/)
designed for storing and sharing neurophysiology data, used in the Brain Research Institute
of the University of Bremen since 2005.
[Python tests workflow](https://github.com/cog-neurophys-lab/dh5io/actions/workflows/python-tests.yml)
- **`dhspec`** contains the specification of the DAQ-HDF5 file format as Python code.
- **`dh5io`** contains code for reading, writing and validating HDF5 files containing data
  according to the DAQ-HDF5 specification.
- **`dh5neo` (WIP)** contains code for reading DAQ-HDF5 data into
  [Neo](https://github.com/NeuralEnsemble/python-neo) objects (e.g. for use with [Elephant](https://elephant.readthedocs.io/en/latest/index.html), [SpikeInterface](https://spikeinterface.readthedocs.io), and [ephyviewer](https://ephyviewer.readthedocs.io/)).
## Getting started
### Installation
Install the package using uv (recommended):
```bash
uv pip install dh5io
```
Or with pip:
```bash
pip install dh5io
```
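To verify that the installation worked, you can query the installed version with the standard library (a quick sanity check; the printed version will match whichever release you installed):

```python
from importlib.metadata import version

# Should print the installed release, e.g. 0.2.1 for the upload described on this page
print(version("dh5io"))
```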
### Reading from and writing to DH5 files
```python
from dh5io.dh5file import DH5File

with DH5File(example_filename, "r") as dh5:
    # inspect file content
    print(dh5)

    cont = dh5.get_cont_group_by_id(1)  # Get CONT group with id 1
    print(cont)

    trialmap = dh5.get_trialmap()
    print(trialmap)
```
```
DAQ-HDF5 File (version 2) <example_filename> containing:
├───CONT Groups (7):
│ ├─── CONT1
│ ├─── CONT60
│ ├─── CONT61
│ ├─── CONT62
│ ├─── CONT63
│ ├─── CONT64
│ └─── CONT1001
├───SPIKE Groups (1):
│ └─── SPIKE0
├─── 10460 Events
└─── 385 Trials in TRIALMAP

/CONT1 in <example_filename>
├─── id: 1
├─── name:
├─── comment:
├─── sample_period: 1000000 ns (1000.0 Hz)
├─── n_channels: 1
├─── n_samples: 1443184
├─── duration: 3021.76 s
├─── n_regions: 385
├─── signal_type: None
├─── calibration: [1.0172526e-07]
├─── data: (1443184, 1)
└─── index: (385,)
```
This example shows how to open a DH5 file, inspect its content, and retrieve a specific CONT
group. The `DH5File` class provides methods for accessing the various groups and datasets
within the file. The `Cont`, `Spike` (coming in future versions) and `Trialmap` classes
provide convenient wrappers for working with these raw HDF5 groups and datasets. The
corresponding [h5py](https://docs.h5py.org/en/stable/index.html) classes can be accessed
directly for lower-level operations using the `_file`, `_group` and `_dataset` attributes
(e.g. `cont._group` or `cont.data._dataset`).
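For example, a short sketch of dropping down to the h5py layer via the `_group` attribute mentioned above (what gets printed depends on the file; the calls below are plain h5py):

```python
import h5py
from dh5io.dh5file import DH5File

with DH5File(example_filename, "r") as dh5:
    cont = dh5.get_cont_group_by_id(1)
    h5_group: h5py.Group = cont._group   # underlying h5py.Group (e.g. /CONT1)
    print(h5_group.name)                 # HDF5 path of the group
    print(dict(h5_group.attrs))          # raw HDF5 attributes stored on the group
```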
As an alternative to the object-oriented approach using `DH5File`, you can use the
functional API provided by the library. This API offers a set of functions for reading
data from and writing data to DH5 files without creating file objects first. The functions
in the respective modules (`h5io.cont`, `h5io.spike`, etc.) take and return
[h5py](https://docs.h5py.org/en/stable/index.html) objects. This is the recommended
approach if you are already familiar with HDF5 and the DH5 format specification.
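For illustration, a minimal sketch of working at this level with plain h5py (the commented-out call is a hypothetical placeholder, not an actual dh5io function name):

```python
import h5py

# Open the file directly with h5py; no DH5File object is created.
with h5py.File("example.dh5", "r") as f:
    cont1 = f["CONT1"]              # plain h5py.Group for a CONT block
    print(list(cont1.keys()))       # datasets stored in the group
    print(dict(cont1.attrs))        # group attributes
    # Functions from the cont/spike modules take such h5py objects as input, e.g.:
    # data = read_cont_data(cont1)  # hypothetical name, shown for illustration only
```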
## Developer setup
- Use [uv](https://docs.astral.sh/uv)
- Set up the pre-push hook for running pytest (a combined sketch follows below):
  ```bash
  git config --local core.hooksPath .githooks
  ```
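A possible end-to-end setup, assuming you clone the GitHub repository linked above (`uv sync` and `uv run` are standard uv commands; adjust to your workflow):

```bash
# Clone the repository and create the project environment with uv
git clone https://github.com/cog-neurophys-lab/dh5io.git
cd dh5io
uv sync

# Enable the repository's hooks directory and run the tests once to verify the setup
git config --local core.hooksPath .githooks
uv run pytest
```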
## Raw data

````json
{
"_id": null,
"home_page": null,
"name": "dh5io",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.11",
"maintainer_email": null,
"keywords": "data format, electrophysiology, hdf5, daq",
"author": null,
"author_email": "Joscha Schmiedt <schmiedt@brain.uni-bremen.de>",
"download_url": "https://files.pythonhosted.org/packages/1c/17/3f718363f0d5b7b5a99ccb2349632afaecf586280183e568aafff4b612ee/dh5io-0.2.1.tar.gz",
"platform": null,
"description": "# Python Tools for the DAQ-HDF5 format\r\n\r\nA Python package for handling\r\n[DAQ-HDF5](https://github.com/cog-neurophys-lab/DAQ-HDF5)(`*.dh5`) files. The DH5 format is\r\na hierarchical data format based on [HDF5](https://www.hdfgroup.org/solutions/hdf5/)\r\ndesigned for storing and sharing neurophysiology data, used in the Brain Research Institute\r\nof the University of Bremen since 2005.\r\n\r\n[](https://github.com/cog-neurophys-lab/dh5io/actions/workflows/python-tests.yml)\r\n\r\n- **`dhspec`** contains the specification of the DAQ-HDF5 file format as Python code.\r\n- **`dh5io`** contains code for reading, writing and validating HDF5 files containing data\r\n according to the DAQ-HDF5 specfication.\r\n- **`dh5neo` (WIP)** contains code for reading DAQ-HDF5 data into\r\n [Neo](https://github.com/NeuralEnsemble/python-neo) objects (e.g. for use with [Elephant](https://elephant.readthedocs.io/en/latest/index.html), [SpikeInterface](https://spikeinterface.readthedocs.io) and [ephyviewer](https://ephyviewer.readthedocs.io/)\r\n\r\n## Getting started \r\n\r\n\r\n### Installation\r\n\r\nInstall the package using uv (recommended):\r\n\r\n```bash\r\nuv\u00a0pip\u00a0install\u00a0dh5io\r\n```\r\n\r\nOr with pip:\r\n\r\n```bash\r\npip\u00a0install\u00a0dh5io\r\n```\r\n\r\n\r\n### Reading and writing from and into DH5 files\r\n\r\n\r\n```python\r\nfrom dh5io.dh5file import DH5File\r\n\r\nwith DH5File(example_filename, \"r\") as dh5:\r\n # inspect file content\r\n print(dh5)\r\n\r\n cont = dh5.get_cont_group_by_id(1) # Get CONT group with id 1\r\n print(cont)\r\n\r\n trialmap = dh5.get_trialmap()\r\n print(trialmap)\r\n```\r\n\r\n\r\n```\r\n DAQ-HDF5 File (version 2) <example_filename> containing:\r\n \u251c\u2500\u2500\u2500CONT Groups (7):\r\n \u2502 \u251c\u2500\u2500\u2500 CONT1\r\n \u2502 \u251c\u2500\u2500\u2500 CONT60\r\n \u2502 \u251c\u2500\u2500\u2500 CONT61\r\n \u2502 \u251c\u2500\u2500\u2500 CONT62\r\n \u2502 \u251c\u2500\u2500\u2500 CONT63\r\n \u2502 \u251c\u2500\u2500\u2500 CONT64\r\n \u2502 \u2514\u2500\u2500\u2500 CONT1001\r\n \u251c\u2500\u2500\u2500SPIKE Groups (1):\r\n \u2502 \u2514\u2500\u2500\u2500 SPIKE0\r\n \u251c\u2500\u2500\u2500 10460 Events\r\n \u2514\u2500\u2500\u2500 385 Trials in TRIALMAP\r\n\r\n /CONT1 in <example_filename>\r\n \u251c\u2500\u2500\u2500 id: 1\r\n \u251c\u2500\u2500\u2500 name: \r\n \u251c\u2500\u2500\u2500 comment: \r\n \u251c\u2500\u2500\u2500 sample_period: 1000000 ns (1000.0 Hz)\r\n \u251c\u2500\u2500\u2500 n_channels: 1\r\n \u251c\u2500\u2500\u2500 n_samples: 1443184\r\n \u251c\u2500\u2500\u2500 duration: 3021.76 s\r\n \u251c\u2500\u2500\u2500 n_regions: 385\r\n \u251c\u2500\u2500\u2500 signal_type: None\r\n \u251c\u2500\u2500\u2500 calibration: [1.0172526e-07]\r\n \u251c\u2500\u2500\u2500 data: (1443184, 1)\r\n \u2514\u2500\u2500\u2500 index: (385,)\r\n```\r\n\r\nThis example shows how to open a DH5 file, inspect its content, and retrieve a specific CONT\r\ngroup. The `DH5File` class provides methods for accessing the various groups and datasets\r\nwithin the file. The `Cont`, `Spike` (coming in next versions) and `Trialmap` classes\r\nprovide convenient wrappers for working with these raw HDF5 groups and datasets. The\r\ncorresponding [h5py](https://docs.h5py.org/en/stable/index.html) classes can be accessed\r\ndirectly for lower-level operations using the `_file`, `_group` and `_dataset` attributes\r\n(e.g. 
`cont._group` or `cont.data._dataset`).\r\n\r\nAs an alternative to the object-oriented approach using `DH5File`, you can use the\r\nfunctional API provided by the library. This API offers a set of functions for reading and\r\nwriting data to DH5 files without the need to create file objects. These functions in the\r\nrespective modules (`h5io.cont`, `h5io.spike`, etc.) use the\r\n[h5py](https://docs.h5py.org/en/stable/index.html) classes as input and output. This is the\r\nrecommended way if you are familiar with HDF5 and the specification of the DH5 format.\r\n\r\n\r\n## Developer setup\r\n\r\n- Use [uv](https://docs.astral.sh/uv)\r\n- Setup pre-push hook for running pytest \r\n ```bash\r\n git config --local core.hooksPath .githooks\r\n ```\r\n",
"bugtrack_url": null,
"license": null,
"summary": "Python tools for DAQ-HDF5 (dh5) file format used at Brain Research Institute of University of Bremen",
"version": "0.2.1",
"project_urls": null,
"split_keywords": [
"data format",
" electrophysiology",
" hdf5",
" daq"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "47e70a018686922e3c4730f3afe8b84c5de7679a0891d6162a9c4042932c2dfb",
"md5": "bc1c3d6b30df3c286745126e991fb02d",
"sha256": "508a482f879f7b72a9e8cd3f2147244ceab10eeacbf00ed7c24036eb5fc47597"
},
"downloads": -1,
"filename": "dh5io-0.2.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "bc1c3d6b30df3c286745126e991fb02d",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.11",
"size": 43342,
"upload_time": "2025-08-21T10:03:17",
"upload_time_iso_8601": "2025-08-21T10:03:17.245652Z",
"url": "https://files.pythonhosted.org/packages/47/e7/0a018686922e3c4730f3afe8b84c5de7679a0891d6162a9c4042932c2dfb/dh5io-0.2.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "1c173f718363f0d5b7b5a99ccb2349632afaecf586280183e568aafff4b612ee",
"md5": "bc85275fd54d557b5e2f4be49179942e",
"sha256": "7439ce863c60998b381d6ec6cc990102c03d060c3e7f9285cfa1db5771705d2c"
},
"downloads": -1,
"filename": "dh5io-0.2.1.tar.gz",
"has_sig": false,
"md5_digest": "bc85275fd54d557b5e2f4be49179942e",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.11",
"size": 41326,
"upload_time": "2025-08-21T10:03:18",
"upload_time_iso_8601": "2025-08-21T10:03:18.145752Z",
"url": "https://files.pythonhosted.org/packages/1c/17/3f718363f0d5b7b5a99ccb2349632afaecf586280183e568aafff4b612ee/dh5io-0.2.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-21 10:03:18",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "dh5io"
}
````