neurocaps


Name: neurocaps
Version: 0.35.2
Summary: Co-activation Patterns (CAPs) Python package
Upload time: 2025-08-15 07:15:18
Requires Python: <3.13,>=3.9.0
Keywords: python, co-activation patterns, caps, neuroimaging, fmri, dfc, dynamic functional connectivity, fmriprep
# NeuroCAPs: Neuroimaging Co-Activation Patterns

[![Latest Version](https://img.shields.io/pypi/v/neurocaps.svg)](https://pypi.python.org/pypi/neurocaps/)
[![Python Versions](https://img.shields.io/pypi/pyversions/neurocaps.svg)](https://pypi.python.org/pypi/neurocaps/)
[![DOI](https://img.shields.io/badge/DOI-10.5281%2Fzenodo.11642615-teal)](https://doi.org/10.5281/zenodo.16880388)
[![Test Status](https://github.com/donishadsmith/neurocaps/actions/workflows/testing.yaml/badge.svg)](https://github.com/donishadsmith/neurocaps/actions/workflows/testing.yaml)
[![Documentation Status](https://readthedocs.org/projects/neurocaps/badge/?version=stable)](http://neurocaps.readthedocs.io/en/stable/?badge=stable)
[![codecov](https://codecov.io/github/donishadsmith/neurocaps/branch/main/graph/badge.svg?token=WS2V7I16WF)](https://codecov.io/github/donishadsmith/neurocaps)
[![Docker](https://img.shields.io/badge/Docker-donishadsmith/neurocaps-darkblue.svg?logo=docker)](https://hub.docker.com/r/donishadsmith/neurocaps/tags/)
[![JOSS](https://joss.theoj.org/papers/0e5c44d5d82402fa0f28e6a8833428f0/status.svg)](https://joss.theoj.org/papers/0e5c44d5d82402fa0f28e6a8833428f0)

NeuroCAPs (**Neuro**imaging **C**o-**A**ctivation **P**attern**s**) is a Python package for performing Co-Activation
Patterns (CAPs) analyses on resting-state or task-based fMRI data. The CAPs approach identifies recurring brain
states by applying k-means clustering to BOLD timeseries data [^1].
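The core idea can be illustrated with a minimal pure-Python k-means sketch on hypothetical toy data (this is not NeuroCAPs' implementation, which uses scikit-learn): each fMRI volume is a point in ROI space, and the cluster centroids are the CAPs.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Tiny k-means sketch: each point is one volume (list of ROI values);
    the final centroids are the co-activation patterns (CAPs)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each volume to its nearest centroid (squared Euclidean distance)
        labels = [
            min(range(k), key=lambda c: sum((p - q) ** 2 for p, q in zip(pt, centroids[c])))
            for pt in points
        ]
        # Recompute each centroid as the mean of its assigned volumes
        for c in range(k):
            members = [pt for pt, lb in zip(points, labels) if lb == c]
            if members:
                centroids[c] = [sum(vals) / len(members) for vals in zip(*members)]
    return centroids, labels

# Toy "BOLD timeseries": 2 ROIs, volumes alternating between two co-activation states
volumes = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]] * 10
caps, state_per_volume = kmeans(volumes, k=2)
print(len(caps), sorted(set(state_per_volume)))
```

The per-volume labels form the state sequence from which the temporal metrics described later are derived.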

<p align="center">
 <img src="docs/assets/workflow.png" width="400" height="700">
</p>

## Installation
**Requires Python 3.9-3.12.**

### Standard Installation
```bash
pip install neurocaps
```

**Windows Users**: Enable [long paths](https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=powershell)
and use:

```bash
pip install neurocaps[windows]
```

### Development Version

```bash
git clone https://github.com/donishadsmith/neurocaps/
cd neurocaps
pip install -e .

# For Windows
# pip install -e .[windows]

# Fetch submodules to include the test data (~140 MB)
git submodule update --init
```

## Docker
A [Docker](https://docs.docker.com/) image is available with demos and headless VTK display configured:

```bash
# Pull image
docker pull donishadsmith/neurocaps && docker tag donishadsmith/neurocaps neurocaps

# Run interactive bash
docker run -it neurocaps

# Run Jupyter Notebook
docker run -it -p 9999:9999 neurocaps notebook
```

## Features
NeuroCAPs is built around two main classes (`TimeseriesExtractor` and `CAP`) and includes several
features to perform the complete CAPs workflow from postprocessing to visualizations.
Notable features include:

| Component | Key Features |
| -------- | ------------|
| **Timeseries Extraction (`TimeseriesExtractor`)** | <ul><li>supports Schaefer, AAL, and deterministic custom parcellations</li><li>performs nuisance regression and motion scrubbing</li><li>reports quality control based on framewise displacement<br><br><b>Important</b>: Optimized for data preprocessed with <a href="https://fmriprep.org/en/stable/">fMRIPrep</a> and assumes the dataset is BIDS compliant. Refer to <a href="https://neurocaps.readthedocs.io/en/stable/bids.html">NeuroCAPs' BIDS Structure and Entities Documentation</a> for additional information.</li></ul> |
| **CAPs Analysis (`CAP`)** | <ul><li>performs k-means clustering</li><li>finds the optimal number of clusters (silhouette, elbow, variance ratio, Davies-Bouldin)</li><li>computes temporal dynamic metrics (temporal fraction, persistence, counts, transition frequency and probabilities) [^2] [^3]</li><li>converts CAPs to NIfTI images</li><li>creates visualizations (heatmaps, outer products, surface plots, correlation matrices, cosine similarity radar plots [^4] [^5]).</li></ul> |
| **Standalone Functions** | <ul><li>plots transition matrices</li><li>merges timeseries data across tasks or sessions [^6]</li><li>generates and fetches custom parcellation approaches</li></ul> |

Full details for every function and parameter are available in the
[API Documentation](https://neurocaps.readthedocs.io/en/stable/api.html).
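As an illustration of one metric listed above, transition probabilities can be understood as row-normalized counts of consecutive CAP pairs in a subject's state sequence. A stdlib sketch (hypothetical state sequence; not NeuroCAPs' implementation):

```python
from collections import Counter

def transition_probabilities(states, n_caps):
    """Row-normalized counts of consecutive CAP pairs in one state sequence.
    Entry [i][j] estimates P(next state = j | current state = i)."""
    pairs = Counter(zip(states, states[1:]))  # count each consecutive (i, j) pair
    matrix = []
    for i in range(n_caps):
        row_total = sum(pairs[(i, j)] for j in range(n_caps))
        matrix.append(
            [pairs[(i, j)] / row_total if row_total else 0.0 for j in range(n_caps)]
        )
    return matrix

seq = [0, 0, 1, 1, 0, 1, 0, 0]  # hypothetical per-volume CAP assignments
print(transition_probabilities(seq, 2))
```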

## Quick Start
The following code demonstrates basic usage of NeuroCAPs (with simulated data) to perform CAPs analysis.
A version of this example using real data from [OpenNeuro](https://openneuro.org/)
is available on the [readthedocs](https://neurocaps.readthedocs.io/en/stable/tutorials/tutorial-8.html).
Additional [tutorials](https://neurocaps.readthedocs.io/en/stable/tutorials/) and
[interactive demonstrations](https://github.com/donishadsmith/neurocaps/tree/main/demos) are
also available.

1. Extract timeseries data
```python
import numpy as np
from neurocaps.extraction import TimeseriesExtractor
from neurocaps.utils import simulate_bids_dataset

# Set seed
np.random.seed(0)

# Generate a BIDS directory with fMRIPrep derivatives
bids_root = simulate_bids_dataset(n_subs=3, n_runs=1, n_volumes=100, task_name="rest")

# Using Schaefer, one of the default parcellation approaches
parcel_approach = {"Schaefer": {"n_rois": 100, "yeo_networks": 7}}

# List of fMRIPrep-derived confounds for nuisance regression
acompcor_names = [f"a_comp_cor_0{i}" for i in range(5)]
confound_names = ["cosine*", "trans*", "rot*", *acompcor_names]

# Initialize extractor with signal cleaning parameters
extractor = TimeseriesExtractor(
    space="MNI152NLin2009cAsym",
    parcel_approach=parcel_approach,
    confound_names=confound_names,
    standardize=False,
    # Run discarded if more than 30% of volumes exceed FD threshold
    fd_threshold={"threshold": 0.90, "outlier_percentage": 0.30},
)

# Extract preprocessed BOLD data
extractor.get_bold(bids_dir=bids_root, task="rest", tr=2, n_cores=1, verbose=False)

# Check QC information
qc_df = extractor.report_qc()
print(qc_df)
```
![Quality Control Dataframe.](paper/qc_df.png)
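The `fd_threshold` criterion used above can be sketched as a simple percentage check over per-volume framewise displacement (toy FD values; this is not NeuroCAPs' internal code):

```python
def run_passes_qc(fd_per_volume, threshold=0.90, outlier_percentage=0.30):
    """Flag volumes whose framewise displacement exceeds the threshold;
    keep the run only if the flagged fraction is at most outlier_percentage."""
    flagged = [fd > threshold for fd in fd_per_volume]
    return sum(flagged) / len(flagged) <= outlier_percentage

fd = [0.1, 0.2, 1.5, 0.3, 0.1, 2.0, 0.2, 0.1, 0.3, 0.2]  # toy FD trace (mm)
print(run_passes_qc(fd))  # 2/10 = 20% flagged -> run kept
```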

2. Identify the optimal number of CAPs with k-means clustering and a cluster-selection heuristic
```python
from neurocaps.analysis import CAP
from neurocaps.utils import PlotDefaults

# Initialize CAP class
cap_analysis = CAP(parcel_approach=extractor.parcel_approach, groups=None)

plot_kwargs = {**PlotDefaults.get_caps(), "figsize": (4, 3), "step": 2}

# Find optimal CAPs (2-20) using silhouette method; results are stored
cap_analysis.get_caps(
    subject_timeseries=extractor.subject_timeseries,
    n_clusters=range(2, 21),
    standardize=True,
    cluster_selection_method="silhouette",
    max_iter=500,
    n_init=10,
    show_figs=True,
    **plot_kwargs,
)
```
<img src="paper/silhouette_plot.png" alt="Silhouette Score Plot." style="width:46%; height:auto;">

3. Compute temporal dynamic metrics for downstream statistical analyses
```python
# Calculate temporal fraction of each CAP
metric_dict = cap_analysis.calculate_metrics(
    extractor.subject_timeseries, metrics=["temporal_fraction"]
)
print(metric_dict["temporal_fraction"])
```

![Temporal Fraction Dataframe.](paper/temporal_fraction_df.png)

Note that CAP-1 is the dominant brain state across subjects (highest frequency).
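The temporal fraction reduces to the proportion of volumes assigned to each CAP; a stdlib sketch on a hypothetical state sequence (not NeuroCAPs' implementation):

```python
from collections import Counter

def temporal_fraction(states):
    """Fraction of volumes a subject spends in each CAP."""
    counts = Counter(states)
    total = len(states)
    return {cap: n / total for cap, n in sorted(counts.items())}

# CAP 0 appears in 4 of 6 volumes, so it is the dominant state here
print(temporal_fraction([0, 0, 0, 1, 0, 1]))
```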

4. Visualize CAPs
```python
# Create surface and radar plots for each CAP
surface_kwargs = {**PlotDefaults.caps2surf(), "layout": "row", "size": (500, 100)}

radar_kwargs = {**PlotDefaults.caps2radar(), "height": 400, "width": 485}
radar_kwargs["radialaxis"] = {"range": [0, 0.4], "tickvals": [0.1, "", "", 0.4]}
radar_kwargs["legend"] = {"yanchor": "top", "y": 0.75, "x": 1.15}

cap_analysis.caps2surf(**surface_kwargs).caps2radar(**radar_kwargs)
```
<img src="paper/cap_1_surface.png" alt="CAP-1 Surface Image." style="width:46%; height:auto;">

<img src="paper/cap_2_surface.png" alt="CAP-2 Surface Image." style="width:46%; height:auto;">

<img src="paper/cap_1_radar.png" alt="CAP-1 Radar Image." style="width:46%; height:auto;">

<img src="paper/cap_2_radar.png" alt="CAP-2 Radar Image." style="width:46%; height:auto;">

Radar plots show network alignment (measured by cosine similarity): "High Amplitude" represents
alignment to activations (> 0), "Low Amplitude" represents alignment to deactivations (< 0).

Each CAP can be characterized using either maximum alignment
(CAP-1: Vis+/SomMot-; CAP-2: SomMot+/Vis-) or predominant alignment ("High Amplitude" − "Low Amplitude";
CAP-1: SalVentAttn+/SomMot-; CAP-2: SomMot+/SalVentAttn-).
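A plausible stdlib sketch of this alignment computation, under the assumption that "High Amplitude" is the cosine similarity between a binary network mask and the positive part of the CAP vector, and "Low Amplitude" uses the negated negative part (illustrative only; not NeuroCAPs' internal code):

```python
import math

def network_alignment(cap_vector, network_mask):
    """Cosine similarity between a network's binary mask and the CAP's
    positive (activation) and negated negative (deactivation) parts."""
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm if norm else 0.0

    high = cosine([max(x, 0.0) for x in cap_vector], network_mask)
    low = cosine([max(-x, 0.0) for x in cap_vector], network_mask)
    return {"High Amplitude": high, "Low Amplitude": low, "Net": high - low}

# Toy CAP over 4 ROIs; the mask selects ROIs 0-1 as one hypothetical network
print(network_alignment([0.8, 0.5, -0.6, -0.2], [1, 1, 0, 0]))
```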

```python
import pandas as pd

for cap_name in cap_analysis.caps["All Subjects"]:
    df = pd.DataFrame(cap_analysis.cosine_similarity["All Subjects"][cap_name])
    df["Net"] = df["High Amplitude"] - df["Low Amplitude"]
    df["Regions"] = cap_analysis.cosine_similarity["All Subjects"]["Regions"]
    print(f"{cap_name}:", "\n", df, "\n")
```
CAP-1:

<img src="paper/cap_1_alignment_df.png" alt="CAP-1 Network Alignment Dataframe." style="width:46%; height:auto;">

CAP-2:

<img src="paper/cap_2_alignment_df.png" alt="CAP-2 Network Alignment Dataframe." style="width:46%; height:auto;">

**Note**: For information about logging, refer to [NeuroCAPs' Logging Guide](https://neurocaps.readthedocs.io/en/stable/user_guide/logging.html).

## Citing
If you would like to cite NeuroCAPs, you can use:

```
   Smith, D. (2025). NeuroCAPs: A Python Package for Performing Co-Activation Patterns Analyses on Resting-State and Task-Based fMRI Data. Journal of Open Source Software, 10(112), 8196. https://doi.org/10.21105/joss.08196
```

## Reporting Issues
Bug reports, feature requests, and documentation enhancements can be reported using the
templates offered when creating a new issue in the
[issue tracker](https://github.com/donishadsmith/neurocaps/issues).

## Contributing
Please refer to the [contributing guidelines](https://neurocaps.readthedocs.io/en/stable/contributing.html)
for guidance on contributing to NeuroCAPs.

## Acknowledgements
NeuroCAPs relies on several popular data processing, machine learning, neuroimaging, and visualization
[packages](https://neurocaps.readthedocs.io/en/stable/#dependencies).

Additionally, some foundational concepts in this package take inspiration from features or design
patterns implemented in other neuroimaging Python packages, specifically:

- mtorabi59's [pydfc](https://github.com/neurodatascience/dFC), a toolbox for comparing
several popular dynamic functional connectivity methods.
- 62442katieb's [IDConn](https://github.com/62442katieb/IDConn), a pipeline for assessing individual
differences in resting-state or task-based functional connectivity.

## References
[^1]: Liu, X., Chang, C., & Duyn, J. H. (2013). Decomposition of spontaneous brain activity into
distinct fMRI co-activation patterns. Frontiers in Systems Neuroscience, 7.
https://doi.org/10.3389/fnsys.2013.00101

[^2]: Liu, X., Zhang, N., Chang, C., & Duyn, J. H. (2018). Co-activation patterns in resting-state
fMRI signals. NeuroImage, 180, 485–494. https://doi.org/10.1016/j.neuroimage.2018.01.041

[^3]: Yang, H., Zhang, H., Di, X., Wang, S., Meng, C., Tian, L., & Biswal, B. (2021). Reproducible
coactivation patterns of functional brain networks reveal the aberrant dynamic state transition in
schizophrenia. NeuroImage, 237, 118193. https://doi.org/10.1016/j.neuroimage.2021.118193

[^4]: Zhang, R., Yan, W., Manza, P., Shokri-Kojori, E., Demiral, S. B., Schwandt, M., Vines, L.,
Sotelo, D., Tomasi, D., Giddens, N. T., Wang, G., Diazgranados, N., Momenan, R., & Volkow, N. D. (2023).
Disrupted brain state dynamics in opioid and alcohol use disorder: attenuation by nicotine use.
Neuropsychopharmacology, 49(5), 876–884. https://doi.org/10.1038/s41386-023-01750-w

[^5]: Ingwersen, T., Mayer, C., Petersen, M., Frey, B. M., Fiehler, J., Hanning, U., Kühn, S.,
Gallinat, J., Twerenbold, R., Gerloff, C., Cheng, B., Thomalla, G., & Schlemm, E. (2024).
Functional MRI brain state occupancy in the presence of cerebral small vessel disease —
A pre-registered replication analysis of the Hamburg City Health Study. Imaging Neuroscience,
2, 1–17. https://doi.org/10.1162/imag_a_00122

[^6]: Kupis, L., Romero, C., Dirks, B., Hoang, S., Parladé, M. V., Beaumont, A. L., Cardona, S. M.,
Alessandri, M., Chang, C., Nomi, J. S., & Uddin, L. Q. (2020). Evoked and intrinsic brain network
dynamics in children with autism spectrum disorder. NeuroImage: Clinical, 28, 102396.
https://doi.org/10.1016/j.nicl.2020.102396

            
