# PyNeuroTrace: Python code for Neural Timeseries
[![Tests](https://github.com/padster/pyNeuroTrace/actions/workflows/python-test.yml/badge.svg)](https://github.com/padster/pyNeuroTrace/actions/workflows/python-test.yml) [![Documentation](https://github.com/padster/pyNeuroTrace/actions/workflows/sphinx.yml/badge.svg)](http://padster.github.io/pyNeuroTrace/)
## Installation
`pyNeuroTrace` can be installed with pip:
```
pip install pyNeuroTrace
```
GPU-supported functions use the Python library CuPy, which has the following requirements:
* An NVIDIA CUDA GPU with Compute Capability 3.0 or higher.
* CUDA Toolkit: v11.2 / v11.3 / v11.4 / v11.5 / v11.6 / v11.7 / v11.8 / v12.0 / v12.1 / v12.2 / v12.3 / v12.4
`pyNeuroTrace` can be installed together with CuPy using pip:
```
pip install pyNeuroTrace[GPU]
```
If CuPy fails to build from the wheel using this command, try installing CuPy first from a wheel that matches your CUDA Toolkit version, e.g.:
```
pip install cupy-cuda12x
```
Then install pyNeuroTrace:
```
pip install pyNeuroTrace
```
For more information on installing CuPy, see the [CuPy installation documentation](https://docs.cupy.dev/en/stable/install.html).
## Documentation
To help you get started with `pyNeuroTrace`, full documentation of the available modules and methods is available at our [pyNeuroTrace github.io site](https://padster.github.io/pyNeuroTrace/).
## Visualization
Probably the most useful part of pyNeuroTrace, a number of visualization functions are provided to display trace data in easy-to-understand formats. For more details, and visual examples of what is available, please consult the [README](https://github.com/padster/pyNeuroTrace/tree/master/pyneurotrace) next to viz.py.
## Notebook utilities
Although not specific to neural time series, pyNeuroTrace also provides a collection of tools that are useful when running analysis in Jupyter notebooks.
These include:
`notebook.filePicker` for selecting an existing file, `notebook.newFilePicker` for indicating a new path, and `notebook.folderPicker` for selecting an existing folder.
> These all open PyQt file dialogs, making it easy for users to interact with the file system. Customisation options exist, for example the prompt text and default locations.
`showTabs(data, func, titles, progressBar=False)`
> Useful for analysis repeated across e.g. different neurons, experiments, or conditions. Given either a list or a dictionary, all items are iterated over, and each is drawn onto its own tab with the provided `func`.
The provided `func` is expected to have the signature `func(idx, key, value)`, where `idx` is the (0-based) index of the tab, `key` is the list index or dictionary key, and `value` is the item to be processed.
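To illustrate the callback contract (the `data` dictionary and `onTab` function here are hypothetical examples, not part of the library), `showTabs` invokes `func` once per item; for a dictionary, the callbacks it makes are equivalent to this iteration:

```python
# Hypothetical example data: one entry per tab.
data = {"neuron1": [0.1, 0.5], "neuron2": [0.2, 0.9]}

calls = []

def onTab(idx, key, value):
    # idx: 0-based tab index; key: dict key (or list index); value: the item.
    calls.append((idx, key, len(value)))

# showTabs(data, onTab, titles=list(data)) would draw each item on its own
# tab; the callbacks it makes are equivalent to:
for idx, (key, value) in enumerate(data.items()):
    onTab(idx, key, value)
```

In a notebook, `onTab` would typically plot `value` with matplotlib, so each tab shows one trace or experiment.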
## Processing Data
Common per-trace processing filters are provided within [filters.py](https://github.com/padster/pyNeuroTrace/tree/master/pyneurotrace/filters.py). These are all designed to take a numpy array of traces, with each row an independent trace, and all return a filtered array of the same size.
These include:
`filters.deltaFOverF0(traces, hz, t0, t1, t2)`
> Converts a raw signal to the standard Delta-F-over-F0, using the technique given in [Jia et al, 2011](http://doi.org/10.1038/nprot.2010.169). The smoothing parameters (t0, t1, t2) are as described in the paper, all in seconds; the sample rate `hz` must also be provided to convert these to sample units.
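As a rough illustration of that pipeline (a hypothetical sketch, not the library's implementation): boxcar-smooth with window t2, take a trailing minimum over t1 as the baseline F0, divide, then smooth the ratio with an exponential filter of time constant t0:

```python
import numpy as np

def delta_f_over_f0_sketch(traces, hz, t0, t1, t2):
    """Simplified sketch of the Jia et al. (2011) dF/F0 pipeline.
    Assumes positive raw fluorescence; pyNeuroTrace's own code may differ."""
    traces = np.atleast_2d(np.asarray(traces, dtype=float))
    nT1, nT2 = max(1, round(t1 * hz)), max(1, round(t2 * hz))  # seconds -> samples
    # Boxcar-smoothed signal, window t2.
    kernel = np.ones(nT2) / nT2
    smoothed = np.vstack([np.convolve(row, kernel, mode="same") for row in traces])
    # Baseline F0: trailing minimum of the smoothed signal over the last t1 seconds.
    f0 = np.empty_like(smoothed)
    for i in range(traces.shape[1]):
        f0[:, i] = smoothed[:, max(0, i - nT1 + 1) : i + 1].min(axis=1)
    r = (traces - f0) / f0
    # Final noise reduction: exponential smoothing with time constant t0.
    alpha = 1.0 - np.exp(-1.0 / (t0 * hz))
    out = np.empty_like(r)
    out[:, 0] = r[:, 0]
    for i in range(1, r.shape[1]):
        out[:, i] = alpha * r[:, i] + (1.0 - alpha) * out[:, i - 1]
    return out
```

Like the library's filters, this takes one trace per row and returns an array of the same shape.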
`filters.okada(traces)`
> Reduces noise in traces by smoothing single peaks or valleys, as described in [Okada et al, 2016](https://doi.org/10.1371/journal.pone.0157595).
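A minimal numpy sketch of the idea (a hypothetical illustration, not the library's code): any sample that is a strict single-sample peak or valley relative to both neighbours is replaced by the neighbours' mean, leaving multi-sample transients intact:

```python
import numpy as np

def okada_sketch(traces):
    """Okada-style smoothing: replace strict one-sample peaks/valleys
    with the average of their two neighbours, row by row."""
    out = np.atleast_2d(np.asarray(traces, dtype=float)).copy()
    for t in range(1, out.shape[1] - 1):
        left, mid, right = out[:, t - 1], out[:, t], out[:, t + 1]
        # A single-sample spike lies above (or below) both neighbours.
        isSpike = (mid - left) * (mid - right) > 0
        out[isSpike, t] = 0.5 * (left[isSpike] + right[isSpike])
    return out
```

A lone spike like `[0, 1, 0]` is flattened, while a monotone ramp passes through unchanged.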
## Event Detection
Python implementations of the three algorithms discussed in our paper [Sakaki et al, 2018](https://doi.org/10.1109/EMBC.2018.8512983) for finding events within calcium fluorescence traces.
`ewma(data, weight)`
> calculates the Exponentially-Weighted Moving Average for each trace, given how strongly to weight new points (vs. the previous average).
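The recurrence is straightforward; this hypothetical sketch (not the library's implementation) shows how `weight` trades responsiveness against smoothing:

```python
import numpy as np

def ewma_sketch(data, weight):
    """Exponentially-weighted moving average along each row.
    weight near 1 tracks new samples quickly; near 0 smooths heavily."""
    data = np.atleast_2d(np.asarray(data, dtype=float))
    out = np.empty_like(data)
    out[:, 0] = data[:, 0]
    for i in range(1, data.shape[1]):
        out[:, i] = weight * data[:, i] + (1.0 - weight) * out[:, i - 1]
    return out
```

With `weight=0.5`, an isolated spike of 4 decays as 4 → 2 → 1 → 0.5 over the following samples.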
`cusum(data, slack)`
> calculates the Cumulative Sum for each trace, taking a slack parameter which controls how far from the mean the signal needs to be before it is no longer considered noise.
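A hypothetical one-sided CUSUM sketch (illustrative, not the library's exact implementation): deviations above the trace mean accumulate once they exceed the slack, and the statistic resets to zero otherwise:

```python
import numpy as np

def cusum_sketch(data, slack):
    """One-sided cumulative sum per row: accumulate deviations above the
    mean beyond a slack allowance; clamp at zero so noise doesn't build up."""
    data = np.atleast_2d(np.asarray(data, dtype=float))
    mean = data.mean(axis=1, keepdims=True)
    out = np.zeros_like(data)
    for i in range(1, data.shape[1]):
        out[:, i] = np.maximum(0.0, out[:, i - 1] + data[:, i] - mean[:, 0] - slack)
    return out
```

A flat trace stays at zero, while a sustained step accumulates steadily, which is what makes CUSUM sensitive to slow-onset events.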
`matchedFilter(data, windowSize, A, tA, tB)`
> calculates the likelihood ratio for each sample to be the end of a window of expected transient shape, being a double exponential with amplitude A, rise-time tA, and decay-time tB (in samples).
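The correlation core of that idea can be sketched as follows (a simplification: the paper's version scores a full log-likelihood ratio under a noise model, and the function name here is hypothetical):

```python
import numpy as np

def matched_filter_sketch(data, windowSize, A, tA, tB):
    """Slide a double-exponential transient template over each trace and
    report the dot product for the window ending at each sample."""
    t = np.arange(windowSize, dtype=float)
    # Zero at onset, rising with time constant tA, decaying with tB (tB > tA).
    template = A * (np.exp(-t / tB) - np.exp(-t / tA))
    data = np.atleast_2d(np.asarray(data, dtype=float))
    out = np.zeros_like(data)
    for i in range(windowSize - 1, data.shape[1]):
        out[:, i] = data[:, i - windowSize + 1 : i + 1] @ template
    return out
```

The score peaks at the sample where a transient of the expected shape ends, which is why its output pairs naturally with a threshold step.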
The results of each of these three detection filters can then be passed through `thresholdEvents(data, threshold)`, to register detected events whenever the filter strength rises above the given threshold.
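The crossing logic can be sketched like this (a hypothetical illustration; the library's `thresholdEvents` may return events in a different representation):

```python
import numpy as np

def threshold_events_sketch(data, threshold):
    """Boolean mask of upward threshold crossings per row: True wherever the
    filter output rises above the threshold after being below it."""
    data = np.atleast_2d(np.asarray(data, dtype=float))
    above = data >= threshold
    rising = above & ~np.roll(above, 1, axis=1)
    rising[:, 0] = above[:, 0]  # np.roll wraps around, so fix the first column
    return rising
```

Applied to e.g. an EWMA or CUSUM output, each True marks one detected event onset rather than every supra-threshold sample.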
## Reading Data (lab-specific)
The code within this repository was designed to read data from experiments performed by the [Kurt Haas lab](https://cps.med.ubc.ca/faculty/haas/) at UBC. If you're from this lab, read below. If not, this part is probably not relevant, but feel free to ask if you'd be interested in loading your own data file formats.
A number of options for loading data files are available within [files.py](https://github.com/padster/pyNeuroTrace/tree/master/pyneurotrace/files.py), including:
* `load2PData(path)` takes an experiment output file (e.g. STEP_5_EXPT.TXT) and returns ID, XYZ location, and raw intensity values for each node in the experiment.
* `loadMetadata(path)` takes a metadata file (e.g. rscan_metadata_step_5.txt) and returns the stimulus start/stop samples, as well as the sample rate for the experiment.
* `loadTreeStructure(path)` takes a tree structure file (e.g. interp-neuron-.txt) and returns the mapping of node IDs to tree information about that node (e.g. node type, children, ...).