:Name: arf
:Version: 2.6.7
:Summary: Advanced Recording Format for acoustic, behavioral, and physiological data
:Home page: https://github.com/melizalab/arf
:Author: Dan Meliza
:Maintainer: Dan Meliza
:Requires Python: >=3.7
:License: BSD 3-Clause License
:Keywords: neuroscience, data format
:Upload time: 2024-01-13 22:25:44

arf
---

|ProjectStatus|_ |Version|_ |BuildStatus|_ |License|_ |PythonVersions|_

.. |ProjectStatus| image:: https://www.repostatus.org/badges/latest/active.svg
.. _ProjectStatus: https://www.repostatus.org/#active

.. |Version| image:: https://img.shields.io/pypi/v/arf.svg
.. _Version: https://pypi.python.org/pypi/arf/

.. |BuildStatus| image:: https://github.com/melizalab/arf/actions/workflows/tests-python.yml/badge.svg
.. _BuildStatus: https://github.com/melizalab/arf/actions/workflows/tests-python.yml

.. |License| image:: https://img.shields.io/pypi/l/arf.svg
.. _License: https://opensource.org/license/bsd-3-clause/

.. |PythonVersions| image:: https://img.shields.io/pypi/pyversions/arf.svg
.. _PythonVersions: https://pypi.python.org/pypi/arf/


The Advanced Recording Format `ARF <https://meliza.org/spec:1/arf/>`__
is an open standard for storing data from neuronal, acoustic, and
behavioral experiments in a portable, high-performance, archival format.
The goal is to enable labs to share data and tools, and to allow
valuable data to be accessed and analyzed for many years in the future.

**ARF** is built on the `HDF5 <http://www.hdfgroup.org/HDF5/>`__
format, and all arf files are accessible through standard HDF5 tools,
including interfaces to HDF5 written for other languages (e.g., MATLAB,
Python). **ARF** comprises a set of specifications on how different
kinds of data are stored. The organization of ARF files is based around
the concept of an *entry*, a collection of data channels associated with
a particular point in time. An entry might contain one or more of the
following:

-  raw extracellular neural signals recorded from a multichannel probe
-  spike times extracted from neural data
-  acoustic signals from a microphone
-  times when an animal interacted with a behavioral apparatus
-  times when a real-time signal analyzer detected vocalizations

Entries and datasets have metadata attributes describing how the data
were collected. Datasets and entries retain these attributes when copied
or moved between arf files, helping to prevent data from becoming
orphaned and uninterpretable.
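
The entry/dataset layout described above maps directly onto HDF5
groups, datasets, and attributes. The following is a minimal sketch
using plain h5py rather than the arf package itself; the entry name,
timestamp format, and attribute names here are illustrative
assumptions, not the normative schema (see the specification for the
required attributes):

.. code:: python

   import h5py

   # Lay out an HDF5 file like an ARF file: one group ("entry") per
   # recording epoch, holding channel datasets plus metadata attributes.
   # Names and attributes are illustrative, not the normative schema.
   with h5py.File("example.arf", "w") as fp:
       entry = fp.create_group("entry_0001")
       entry.attrs["timestamp"] = [1705184744, 0]  # seconds, microseconds
       pcm = entry.create_dataset("pcm", data=list(range(10)), dtype="i2")
       pcm.attrs["sampling_rate"] = 20000

   # Metadata travels with the data: reopening the file recovers it.
   with h5py.File("example.arf", "r") as fp:
       assert list(fp.keys()) == ["entry_0001"]
       assert fp["entry_0001/pcm"].attrs["sampling_rate"] == 20000

Because the container is ordinary HDF5, this file is readable by any
HDF5 interface, with or without the arf libraries.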

This repository contains:

-  The specification for arf (in specification.md). This is also hosted
   at https://meliza.org/spec:1/arf/.
-  A fast, type-safe C++ interface for reading and writing arf files
-  A python interface for reading and writing arf files (based on h5py).

contributing
~~~~~~~~~~~~

ARF is under active development and we welcome comments and
contributions from neuroscientists and behavioral biologists interested
in using it. We’re particularly interested in use cases that don’t fit
the current specification. Please post issues or contact Dan Meliza (dan
at meliza.org) directly.

The MATLAB interface is out of date and could use some work.

installation
~~~~~~~~~~~~

ARF files require HDF5>=1.8 (http://www.hdfgroup.org/HDF5).

The python interface requires Python 3.7 or greater, numpy>=1.19, and
h5py>=2.10. The last version to support Python 2 was ``2.5.1``. To
install the module:

.. code:: bash

   pip install arf

To use the C++ interface, you need boost>=1.42 (http://boost.org). In
addition, if writing multithreaded code, HDF5 needs to be compiled with
``--enable-threadsafe``. The interface is header-only and does not need
to be compiled. To install:

.. code:: bash

   make install

version information
~~~~~~~~~~~~~~~~~~~

The specification and implementations provided in this project use a
form of semantic versioning (http://semver.org). Specifications receive
a major and minor version number. Changes to minor version numbers must
be backwards compatible (i.e., only added requirements). The current
released version of the ARF specification is ``2.1``.

Implementation versions are synchronized with the major version of the
specification but otherwise evolve independently. For example, the
python ``arf`` package version ``2.1.0`` is compatible with any ARF
version ``2.x``.
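
Under this scheme, compatibility can be decided from the major version
alone. A stdlib-only sketch of the rule stated above (the function name
is ours, not part of the arf package):

.. code:: python

   def arf_compatible(impl_version: str, spec_version: str) -> bool:
       """Return True if an implementation can read files written under
       the given ARF specification version (same major version)."""
       impl_major = int(impl_version.split(".")[0])
       spec_major = int(spec_version.split(".")[0])
       return impl_major == spec_major

   # The python arf package 2.1.0 is compatible with any ARF 2.x spec:
   assert arf_compatible("2.1.0", "2.1")
   assert arf_compatible("2.6.7", "2.0")
   assert not arf_compatible("2.1.0", "1.0")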

There was no public release of ARF prior to ``2.0``.

access ARF files with HDF5 tools
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This section describes how to inspect ARF files using standard tools, in
the event that the interfaces described here cease to function.

The structure of an ARF file can be explored using the ``h5ls`` tool.
For example, to list entries:

.. code:: bash

   $ h5ls file.arf
   test_0001                Group
   test_0002                Group
   test_0003                Group
   test_0004                Group

Each entry appears as a Group. To list the contents of an entry, use
path notation:

.. code:: bash

   $ h5ls file.arf/test_0001
   pcm                      Dataset {609914}

This shows that the data in ``test_0001`` is stored in a single node,
``pcm``, with 609914 data points. Typically each channel will have its
own dataset.

The ``h5dump`` command can be used to output data in binary format. See
the HDF5 documentation for details on how to structure the output. For
example, to extract sampled data to a 16-bit little-endian file (i.e.,
PCM format):

.. code:: bash

   h5dump -d /test_0001/pcm -b LE -o test_0001.pcm file.arf
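
The resulting ``.pcm`` file is just raw 16-bit little-endian samples,
so it can be read back with nothing but the Python standard library
(numpy's ``fromfile`` works equally well). The snippet below fabricates
a small stand-in file rather than assuming ``h5dump`` output is
present; the filename follows the example above:

.. code:: python

   import array
   import sys

   # Write a few int16 samples in little-endian order, standing in for
   # the output of `h5dump -b LE`.
   samples = array.array("h", [0, 100, -100, 32767])
   if sys.byteorder == "big":  # array uses native order; force LE
       samples.byteswap()
   with open("test_0001.pcm", "wb") as fp:
       samples.tofile(fp)

   # Read the raw little-endian PCM data back as int16.
   data = array.array("h")
   with open("test_0001.pcm", "rb") as fp:
       data.fromfile(fp, 4)
   if sys.byteorder == "big":
       data.byteswap()
   print(list(data))  # [0, 100, -100, 32767]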

related projects
~~~~~~~~~~~~~~~~

-  `arfx <https://github.com/melizalab/arfx>`__ is a commandline tool
   for manipulating ARF files.

open data formats
^^^^^^^^^^^^^^^^^

-  `neurodata without borders <http://www.nwb.org>`__ has similar goals
   and also uses HDF5 for storage. The data schema is considerably more
   complex, but it does seem to be achieving growing adoption.
-  `pandora <https://github.com/G-Node/pandora>`__ is also under active
   development.

i/o libraries
^^^^^^^^^^^^^

-  `neo <https://github.com/NeuralEnsemble/python-neo>`__ is a Python
   package for working with electrophysiology data, with support for
   reading a wide range of neurophysiology file formats.
-  `neuroshare <http://neuroshare.org>`__ is a set of routines for
   reading and writing data in various proprietary and open formats.

            
