# NeuroCAPs: Neuroimaging Co-Activation Patterns

[![Latest Version](https://img.shields.io/pypi/v/neurocaps.svg)](https://pypi.python.org/pypi/neurocaps/)
[![Python Versions](https://img.shields.io/pypi/pyversions/neurocaps.svg)](https://pypi.python.org/pypi/neurocaps/)
[![DOI](https://img.shields.io/badge/DOI-10.5281%2Fzenodo.11642615-teal)](https://doi.org/10.5281/zenodo.16430050)
[![Github Repository](https://img.shields.io/badge/Source%20Code-neurocaps-purple)](https://github.com/donishadsmith/neurocaps)
[![Test Status](https://github.com/donishadsmith/neurocaps/actions/workflows/testing.yaml/badge.svg)](https://github.com/donishadsmith/neurocaps/actions/workflows/testing.yaml)
[![Documentation Status](https://readthedocs.org/projects/neurocaps/badge/?version=stable)](http://neurocaps.readthedocs.io/en/stable/?badge=stable)
[![Codecov](https://codecov.io/github/donishadsmith/neurocaps/graph/badge.svg?token=WS2V7I16WF)](https://codecov.io/github/donishadsmith/neurocaps)
[![Code Style: Black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![License](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
![Platform Support](https://img.shields.io/badge/OS-Ubuntu%20|%20macOS%20|%20Windows-blue)
[![Docker](https://img.shields.io/badge/docker-donishadsmith/neurocaps-darkblue.svg?logo=docker)](https://hub.docker.com/r/donishadsmith/neurocaps/tags/)
[![JOSS](https://joss.theoj.org/papers/0e5c44d5d82402fa0f28e6a8833428f0/status.svg)](https://joss.theoj.org/papers/0e5c44d5d82402fa0f28e6a8833428f0)

NeuroCAPs (**Neuro**imaging **C**o-**A**ctivation **P**attern**s**) is a Python package for performing Co-Activation
Patterns (CAPs) analyses on resting-state or task-based fMRI data. The CAPs approach identifies recurring brain states
by applying k-means clustering to BOLD timeseries data [^1].
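
The core idea can be illustrated with a minimal, hypothetical sketch (not NeuroCAPs' internal implementation): each
retained fMRI volume is treated as a point in ROI space, and k-means groups the volumes into a small set of recurring
co-activation patterns, i.e. the cluster centroids.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy data: 300 timepoints x 100 ROIs of parcellated BOLD signal
rng = np.random.default_rng(0)
bold_frames = rng.standard_normal((300, 100))

# Each frame is one observation; the cluster centroids are the CAPs
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(bold_frames)

caps = kmeans.cluster_centers_  # shape (4, 100): one co-activation pattern per cluster
frame_labels = kmeans.labels_   # shape (300,): CAP assignment of every frame
```

The frame-wise assignments are what the temporal dynamic metrics described under Features are computed from.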

<img src="docs/assets/workflow.png">

## Installation
**NeuroCAPs requires Python 3.9-3.12.**

To install NeuroCAPs, follow the instructions below using your preferred terminal.

### Standard Installation from PyPI
```bash

pip install neurocaps

```

#### Windows Users
On Windows, PyBIDS is not installed by default because installation can fail when long paths are not enabled
(refer to the official [Microsoft documentation](https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=powershell)
to enable this feature).

To include PyBIDS in your installation, use:

```bash

pip install neurocaps[windows]

```

Alternatively, you can install PyBIDS separately:

```bash

pip install pybids

```

### Installation from Source (Development Version)
To install the latest development version from source, there are two options:

1. Install directly via pip:
```bash

pip install git+https://github.com/donishadsmith/neurocaps.git

```

2. Clone the repository and install locally:

```bash

git clone --depth 1 https://github.com/donishadsmith/neurocaps/
cd neurocaps
pip install -e .
# Fetch submodules to include the test dataset (~140 MB)
git submodule update --init

```

#### Windows Users
To include PyBIDS when installing the development version on Windows, use:

```bash

git clone --depth 1 https://github.com/donishadsmith/neurocaps/
cd neurocaps
pip install -e .[windows]
# Fetch submodules to include the test dataset (~140 MB)
git submodule update --init
```
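
Whichever installation route you use, the install can be confirmed by querying the installed distribution's version
(this uses only the standard library and makes no assumptions about the package's internals):

```python
from importlib.metadata import version

# Prints the installed version, e.g. "0.34.3"
print(version("neurocaps"))
```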

## Docker
If [Docker](https://docs.docker.com/) is available on your system, you can use the NeuroCAPs Docker
image, which includes the demos and configures a headless display for VTK.

To pull the Docker image:
```bash

docker pull donishadsmith/neurocaps && docker tag donishadsmith/neurocaps neurocaps
```

The image can be run as:

1. An interactive bash session (default):

```bash

docker run -it neurocaps
```

2. A Jupyter Notebook with port forwarding:

```bash

docker run -it -p 9999:9999 neurocaps notebook
```

## Features
NeuroCAPs is built around two main classes (``TimeseriesExtractor`` and ``CAP``) and includes several
features to perform the complete CAPs workflow from postprocessing to visualizations.
Notable features include:

- Timeseries Extraction (``TimeseriesExtractor``):
    - extracts BOLD timeseries from resting-state or task-based fMRI data
    - supports deterministic parcellations such as the Schaefer and AAL atlases, as well as custom-defined deterministic parcellations
    - performs nuisance regression, motion scrubbing, and related signal-cleaning steps
    - reports quality control information based on framewise displacement

    **Important**:
       NeuroCAPs is most optimized for fMRI data preprocessed with
       [fMRIPrep](https://fmriprep.org/en/stable/) and assumes the data are BIDS compliant.
       Refer to [NeuroCAPs' BIDS Structure and Entities Documentation](https://neurocaps.readthedocs.io/en/stable/bids.html)
       for additional information.

- CAPs Analysis (``CAP``):
    - performs k-means clustering on individuals or groups
    - identifies the optimal number of clusters using the Silhouette, Elbow, Davies-Bouldin, or Variance Ratio method
    - computes several temporal dynamic metrics [^2] [^3] (illustrated in the sketch at the end of this section):
        - temporal fraction (fraction of time)
        - persistence (dwell time)
        - counts (state initiation)
        - transition frequency & probability
    - produces several visualizations:
        - heatmaps and outer product plots
        - surface plots
        - correlation matrices
        - cosine similarity radar plots [^4] [^5]

- Utilities:
  - plot transition matrices
  - merge timeseries data across tasks or sessions [^6]
  - generate the custom parcellation dictionary structure from the parcellation's metadata file
  - fetch preset custom parcellation approaches

Full details for every function and parameter are available in the
[API Documentation](https://neurocaps.readthedocs.io/en/stable/api.html).
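
The temporal dynamic metrics listed above have simple definitions in terms of the per-frame CAP assignments. The
sketch below illustrates those definitions on a toy label sequence; it is not NeuroCAPs' implementation, and the
variable names are made up. In the workflow below, these quantities are obtained through ``CAP.calculate_metrics``.

```python
import numpy as np

# Toy sequence of CAP assignments for one subject (one label per retained frame)
labels = np.array([0, 0, 1, 1, 1, 0, 2, 2, 0, 0])
n_caps = 3

# Temporal fraction: proportion of frames spent in each CAP
temporal_fraction = np.bincount(labels, minlength=n_caps) / labels.size

# Segments: consecutive runs of the same CAP
change_points = np.flatnonzero(np.diff(labels)) + 1
segments = np.split(labels, change_points)

# Counts (state initiation): how many segments start with each CAP
counts = np.bincount([seg[0] for seg in segments], minlength=n_caps)

# Persistence (dwell time): mean segment length per CAP, in frames (multiply by the TR for seconds)
persistence = np.array(
    [np.mean([seg.size for seg in segments if seg[0] == cap]) if counts[cap] else 0.0 for cap in range(n_caps)]
)

# Transition frequency: total number of switches between different CAPs
transition_frequency = change_points.size

print(temporal_fraction, counts, persistence, transition_frequency)
```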

## Workflow
The following code provides a high-level overview of using NeuroCAPs to perform a CAPs analysis. An interactive
variant of this workflow is available on the
[readthedocs](https://neurocaps.readthedocs.io/en/stable/tutorials/tutorial-8.html). Additional
[tutorials](https://neurocaps.readthedocs.io/en/stable/tutorials/) and
[interactive demonstrations](https://github.com/donishadsmith/neurocaps/tree/main/demos) are
also provided.

1. Extract timeseries data
```python
from neurocaps.extraction import TimeseriesExtractor

# Using Schaefer, one of the default parcellation approaches
parcel_approach = {"Schaefer": {"n_rois": 100, "yeo_networks": 7}}

# List of fMRIPrep-derived confounds for nuisance regression
confound_names = [
    "cosine*",
    "trans_x",
    "trans_x_derivative1",
    "trans_y",
    "trans_y_derivative1",
    "trans_z",
    "trans_z_derivative1",
    "rot_x",
    "rot_x_derivative1",
    "rot_y",
    "rot_y_derivative1",
    "rot_z",
    "rot_z_derivative1",
    "a_comp_cor_00",
    "a_comp_cor_01",
    "a_comp_cor_02",
    "a_comp_cor_03",
    "a_comp_cor_04",
]

# Initialize extractor with signal cleaning parameters
extractor = TimeseriesExtractor(
    space="MNI152NLin2009cAsym",
    parcel_approach=parcel_approach,
    confound_names=confound_names,
    standardize=False,
    fd_threshold={
        "threshold": 0.50,
        "outlier_percentage": 0.30,
    },
)

# Extract BOLD data from preprocessed fMRIPrep data
# which should be located in the "derivatives" folder
# within the BIDS root directory
# The extracted timeseries data is automatically stored
extractor.get_bold(
    bids_dir="path/to/bids/root",
    pipeline_name="fmriprep",
    session="1",
    task="rest",
    tr=2,
    verbose=False,
)

# Retrieve the dataframe containing QC information for each subject
# to use for downstream statistical analyses
qc_df = extractor.report_qc()
print(qc_df)
```
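
As a conceptual aside, the nuisance regression performed during extraction amounts to regressing the confound
timeseries out of each ROI signal and keeping the residuals. A minimal, illustrative version of that idea (not
NeuroCAPs' internal code) is:

```python
import numpy as np

# Toy shapes: 300 timepoints, 100 ROIs, 18 confound regressors
rng = np.random.default_rng(0)
roi_signals = rng.standard_normal((300, 100))
confounds = rng.standard_normal((300, 18))

# Build a design matrix (intercept + confounds), fit OLS per ROI, keep the residuals
design = np.column_stack([np.ones(confounds.shape[0]), confounds])
beta, *_ = np.linalg.lstsq(design, roi_signals, rcond=None)
cleaned = roi_signals - design @ beta  # confound-regressed timeseries
```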

2. Use k-means clustering to identify the optimal number of CAPs from the data using a heuristic
```python
from neurocaps.analysis import CAP

# Initialize CAP class
cap_analysis = CAP(parcel_approach=extractor.parcel_approach)

# Identify the optimal number of CAPs (clusters)
# using the elbow method to test 2-20
# The optimal number of CAPs is automatically stored
cap_analysis.get_caps(
    subject_timeseries=extractor.subject_timeseries,
    n_clusters=range(2, 21),
    standardize=True,
    cluster_selection_method="elbow",
    max_iter=500,
    n_init=10,
    random_state=0,
)
```
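
For context on what the elbow heuristic does: the within-cluster sum of squares (inertia) is computed for each
candidate number of clusters, and the "elbow" is the point after which adding clusters stops producing a large drop.
A standalone sketch with scikit-learn (illustrative only; `get_caps` handles this internally):

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in for the stacked subject timeseries (frames x ROIs)
rng = np.random.default_rng(0)
frames = rng.standard_normal((500, 100))

# Inertia for each candidate number of clusters
inertias = {
    k: KMeans(n_clusters=k, n_init=10, random_state=0).fit(frames).inertia_
    for k in range(2, 21)
}
for k, wss in inertias.items():
    print(k, round(wss, 1))
# The elbow is where the curve flattens; packages such as kneed can automate locating it.
```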

3. Compute temporal dynamic metrics for downstream statistical analyses
```python
# Calculate temporal fraction of each CAP for all subjects
output = cap_analysis.calculate_metrics(extractor.subject_timeseries, metrics=["temporal_fraction"])
print(output["temporal_fraction"])
```

4. Visualize CAPs
```python
# Project CAPs onto surface plots and generate cosine similarity radar plots of network alignment
cap_analysis.caps2surf().caps2radar()
```
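
The radar plot summarizes how each CAP aligns with canonical networks. Conceptually, that alignment is a cosine
similarity between a CAP's ROI loadings and each network's ROI membership; the sketch below is a simplified,
hand-rolled illustration (the network masks here are made up and not taken from NeuroCAPs):

```python
import numpy as np

rng = np.random.default_rng(0)
cap = rng.standard_normal(100)  # one CAP: loadings for 100 ROIs

# Hypothetical binary membership masks for two networks over the same 100 ROIs
networks = {
    "Default": (rng.random(100) < 0.20).astype(float),
    "Salience": (rng.random(100) < 0.15).astype(float),
}

for name, mask in networks.items():
    cosine = cap @ mask / (np.linalg.norm(cap) * np.linalg.norm(mask))
    print(f"{name}: {cosine:+.3f}")
```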

## Acknowledgements
NeuroCAPs relies on several popular data processing, machine learning, neuroimaging, and visualization
[packages](https://neurocaps.readthedocs.io/en/stable/#dependencies).

Additionally, some foundational concepts in this package take inspiration from features or design
patterns implemented in other neuroimaging Python packages, specifically:

- mtorabi59's [pydfc](https://github.com/neurodatascience/dFC), a toolbox that allows comparisons
among several popular dynamic functional connectivity methods.
- 62442katieb's [IDConn](https://github.com/62442katieb/IDConn), a pipeline for assessing individual
differences in resting-state or task-based functional connectivity.

## Reporting Issues
Bug reports, feature requests, and documentation enhancements can be reported using the
templates offered when creating a new issue in the
[issue tracker](https://github.com/donishadsmith/neurocaps/issues).

## Contributing
Please refer to the [contributing guidelines](https://neurocaps.readthedocs.io/en/stable/contributing.html)
on how to contribute to NeuroCAPs.

## References
[^1]: Liu, X., Chang, C., & Duyn, J. H. (2013). Decomposition of spontaneous brain activity into
distinct fMRI co-activation patterns. Frontiers in Systems Neuroscience, 7.
https://doi.org/10.3389/fnsys.2013.00101

[^2]: Liu, X., Zhang, N., Chang, C., & Duyn, J. H. (2018). Co-activation patterns in resting-state
fMRI signals. NeuroImage, 180, 485–494. https://doi.org/10.1016/j.neuroimage.2018.01.041

[^3]: Yang, H., Zhang, H., Di, X., Wang, S., Meng, C., Tian, L., & Biswal, B. (2021). Reproducible
coactivation patterns of functional brain networks reveal the aberrant dynamic state transition in
schizophrenia. NeuroImage, 237, 118193. https://doi.org/10.1016/j.neuroimage.2021.118193

[^4]: Zhang, R., Yan, W., Manza, P., Shokri-Kojori, E., Demiral, S. B., Schwandt, M., Vines, L.,
Sotelo, D., Tomasi, D., Giddens, N. T., Wang, G., Diazgranados, N., Momenan, R., & Volkow, N. D. (2023).
Disrupted brain state dynamics in opioid and alcohol use disorder: attenuation by nicotine use.
Neuropsychopharmacology, 49(5), 876–884. https://doi.org/10.1038/s41386-023-01750-w

[^5]: Ingwersen, T., Mayer, C., Petersen, M., Frey, B. M., Fiehler, J., Hanning, U., Kühn, S.,
Gallinat, J., Twerenbold, R., Gerloff, C., Cheng, B., Thomalla, G., & Schlemm, E. (2024).
Functional MRI brain state occupancy in the presence of cerebral small vessel disease —
A pre-registered replication analysis of the Hamburg City Health Study. Imaging Neuroscience,
2, 1–17. https://doi.org/10.1162/imag_a_00122

[^6]: Kupis, L., Romero, C., Dirks, B., Hoang, S., Parladé, M. V., Beaumont, A. L., Cardona, S. M.,
Alessandri, M., Chang, C., Nomi, J. S., & Uddin, L. Q. (2020). Evoked and intrinsic brain network
dynamics in children with autism spectrum disorder. NeuroImage: Clinical, 28, 102396.
https://doi.org/10.1016/j.nicl.2020.102396

[^7]: Gu, H., Lee, J., Kim, S., Lim, J., Lee, H.-J., Lee, H., Choe, M., Yoo, D.-G., Ryu, J. H., Lim, S., &
Lee, S.-H. (2024). Discrimination-Estimation Task. OpenNeuro. [Dataset].
https://doi.org/10.18112/openneuro.ds005381.v1.0.0
