# neurocaps
[](https://pypi.python.org/pypi/neurocaps/)
[](https://pypi.python.org/pypi/neurocaps/)
[](https://doi.org/10.5281/zenodo.14886003)
[](https://github.com/donishadsmith/neurocaps)
[](https://github.com/donishadsmith/neurocaps/actions/workflows/testing.yaml)
[](http://neurocaps.readthedocs.io/en/stable/?badge=stable)
[](https://codecov.io/github/donishadsmith/neurocaps)
[](https://github.com/psf/black)
[](https://opensource.org/licenses/MIT)

[](https://hub.docker.com/r/donishadsmith/neurocaps/tags/)
neurocaps is a Python package for performing Co-activation Patterns (CAPs) analyses on resting-state or task-based fMRI
data. CAPs analysis identifies recurring brain states through k-means clustering of BOLD timeseries data [^1].
**Note:** neurocaps is most optimized for fMRI data preprocessed with fMRIPrep and assumes the data is BIDS compliant.
Refer to [neurocaps' BIDS Structure and Entities Documentation](https://neurocaps.readthedocs.io/en/stable/bids.html)
for additional information.
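Conceptually, CAP identification boils down to k-means clustering of individual BOLD volumes, where each cluster centroid is one CAP. A minimal sketch using scikit-learn (random data stands in for real timeseries; this illustrates the idea only and is not the neurocaps API):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
timeseries = rng.standard_normal((300, 100))  # 300 volumes x 100 ROIs

# Cluster volumes; each centroid is one co-activation pattern (CAP)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(timeseries)
caps = kmeans.cluster_centers_    # (2, 100): one CAP per row
state_sequence = kmeans.labels_   # (300,): CAP assigned to each volume
```

neurocaps wraps this clustering step (plus extraction, cluster-size selection, metrics, and plotting) behind the `TimeseriesExtractor` and `CAP` classes described below.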
## Installation
To install neurocaps, follow the instructions below using your preferred terminal.
### Standard Installation from PyPI
```bash
pip install neurocaps
```
**Windows Users**
To avoid installation errors caused by long paths not being enabled, pybids is not installed by default on Windows.
Refer to the official [Microsoft documentation](https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=powershell)
to enable long paths.
To include pybids in your installation, use:
```bash
pip install neurocaps[windows]
```
Alternatively, you can install pybids separately:
```bash
pip install pybids
```
### Installation from Source (Development Version)
To install the latest development version from source, there are two options:
1. Install directly via pip:
```bash
pip install git+https://github.com/donishadsmith/neurocaps.git
```
2. Clone the repository and install locally:
```bash
git clone https://github.com/donishadsmith/neurocaps/
cd neurocaps
pip install -e .
```
**Windows Users**
To include pybids when installing the development version on Windows, use:
```bash
git clone https://github.com/donishadsmith/neurocaps/
cd neurocaps
pip install -e .[windows]
```
## Docker
If [Docker](https://docs.docker.com/) is available on your system, you can use the neurocaps Docker image, which
includes the demos and configures a headless display for VTK.
To pull the Docker image:
```bash
docker pull donishadsmith/neurocaps && docker tag donishadsmith/neurocaps neurocaps
```
The image can be run as:
1. An interactive bash session (default):
```bash
docker run -it neurocaps
```
2. A Jupyter Notebook with port forwarding:
```bash
docker run -it -p 9999:9999 neurocaps notebook
```
## Usage
**Note:** Documentation for each function can be found in the [API](https://neurocaps.readthedocs.io/en/stable/api.html)
section of the documentation homepage.
**This package contains two main classes: `TimeseriesExtractor` for extracting the timeseries, and `CAP` for performing the CAPs analysis.**
**Main features for `TimeseriesExtractor` include:**
- **Timeseries Extraction:** Extract timeseries for resting-state or task data using Schaefer, AAL, or a lateralized Custom parcellation for spatial dimensionality reduction.
- **Parallel Processing:** Use parallel processing to speed up timeseries extraction.
- **Saving Timeseries:** Save the nested dictionary containing timeseries as a pickle file.
- **Visualization:** Visualize the timeseries at the region or node level of the parcellation.
**Main features for `CAP` include:**
- **Grouping:** Perform the CAPs analysis for the entire sample or for groups of subject IDs.
- **Optimal Cluster Size Identification:** Use the Davies-Bouldin, Silhouette, Elbow, or Variance Ratio criterion to identify the optimal cluster size and automatically save the optimal model as an attribute.
- **Parallel Processing:** Use parallel processing to speed up optimal cluster size identification.
- **CAP Visualization:** Visualize the CAPs as outer products or heatmaps at either the region or node level of the parcellation.
- **Save CAPs as NifTIs:** Convert the atlas used for parcellation to a statistical NifTI image.
- **Surface Plot Visualization:** Project CAPs onto a surface plot.
- **Correlation Matrix Creation:** Create a correlation matrix from CAPs.
- **CAP Metrics Calculation:** Calculate several CAP metrics as described in [Liu et al., 2018](https://doi.org/10.1016/j.neuroimage.2018.01.041)[^1] and [Yang et al., 2021](https://doi.org/10.1016/j.neuroimage.2021.118193)[^2]:
  - *Temporal Fraction:* The proportion of volumes in a run spent in a single CAP.
  - *Persistence:* The average time spent in a single CAP before transitioning to another CAP.
  - *Counts:* The total number of initiations of a specific CAP across an entire run. An initiation is the onset of a contiguous segment of that CAP.
  - *Transition Frequency:* The number of transitions between different CAPs across the entire run.
  - *Transition Probability:* The probability of transitioning from one CAP to another (or the same CAP), calculated as (number of transitions from A to B) / (total transitions from A).
- **Cosine Similarity Radar Plots:** Create radar plots showing the cosine similarity between the positive and negative
activations of each CAP and each a priori region in a parcellation [^3] [^4].
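The metric definitions above can be made concrete with a small, self-contained sketch. The label sequence here is hypothetical, and the code illustrates the definitions themselves, not the neurocaps implementation:

```python
labels = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]  # hypothetical per-volume CAP labels for one run
n = len(labels)

# Temporal fraction: proportion of volumes assigned to each CAP
temporal_fraction = {c: labels.count(c) / n for c in set(labels)}

# Contiguous segments of the same CAP, as (cap, duration) pairs
segments = []
start = 0
for i in range(1, n + 1):
    if i == n or labels[i] != labels[start]:
        segments.append((labels[start], i - start))
        start = i

# Counts: number of segment initiations per CAP
counts = {c: sum(1 for cap, _ in segments if cap == c) for c in set(labels)}

# Persistence: mean segment duration per CAP (multiply by the TR for seconds)
persistence = {
    c: sum(d for cap, d in segments if cap == c) / counts[c] for c in set(labels)
}

# Transition frequency: number of switches between different CAPs
transition_frequency = sum(1 for a, b in zip(labels, labels[1:]) if a != b)

# Transition probability: P(next == b | current == a), self-transitions included
pairs = list(zip(labels, labels[1:]))
transition_probability = {
    (a, b): sum(p == (a, b) for p in pairs) / sum(p[0] == a for p in pairs)
    for a in set(labels)
    for b in set(labels)
}
```

For this sequence, CAP 0 occupies half the volumes (temporal fraction 0.5), initiates three times with a mean persistence of 5/3 volumes, and there are four transitions between different CAPs.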
**The `neurocaps.analysis` submodule also contains additional functions:**
- `merge_dicts`: Merges the `subject_timeseries` dictionaries of overlapping subjects across tasks to identify similar CAPs across different tasks [^5]. The merged dictionary can be saved as a pickle file.
- `standardize`: Standardizes each run independently for all subjects in the subject timeseries.
- `change_dtype`: Changes the dtype of all subjects in the subject timeseries to help with memory usage.
- `transition_matrix`: Uses the "transition_probability" output from `CAP.calculate_metrics` to generate and visualize the averaged transition probability matrix for all groups from the analysis.
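As a conceptual illustration of what per-run standardization means: each run's timeseries is z-scored independently along the time axis. This is a sketch, not the neurocaps implementation, and the use of the sample standard deviation (`ddof=1`) here is an assumption:

```python
import numpy as np

# Hypothetical nested structure: {subject: {run: (volumes x ROIs) array}}
rng = np.random.default_rng(0)
subject_timeseries = {"0004": {"run-1": rng.normal(5.0, 2.0, size=(20, 4))}}

# Z-score each run independently along the time axis
for runs in subject_timeseries.values():
    for name, ts in runs.items():
        std = ts.std(axis=0, ddof=1)
        std[std == 0] = 1.0  # guard constant columns against division by zero
        runs[name] = (ts - ts.mean(axis=0)) / std

standardized = subject_timeseries["0004"]["run-1"]
```

After this, every ROI column in every run has mean 0 and unit variance, which keeps k-means from being dominated by ROIs with larger raw signal scales.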
Refer to the [demos](https://github.com/donishadsmith/neurocaps/tree/main/demos) or
the [tutorials](https://neurocaps.readthedocs.io/en/stable/examples/examples.html) on the documentation website
for a more extensive demonstration of the features included in this package.
**Demonstration:**
Use a dataset from OpenNeuro [^6]:
```python
# Download a sample dataset from OpenNeuro; requires the openneuro-py package
# [Dataset] doi: 10.18112/openneuro.ds005381.v1.0.0
import os

from openneuro import download

demo_dir = "neurocaps_demo"
os.makedirs(demo_dir, exist_ok=True)

# Include the run-1 and run-2 data from two tasks for two subjects
include = [
    "dataset_description.json",
    "sub-0004/ses-2/func/*run-[12]*events*",
    "sub-0006/ses-2/func/*run-[12]*events*",
    "derivatives/fmriprep/sub-0004/fmriprep/sub-0004/ses-2/func/*run-[12]*confounds_timeseries*",
    "derivatives/fmriprep/sub-0004/fmriprep/sub-0004/ses-2/func/*run-[12]_space-MNI152NLin*preproc_bold*",
    "derivatives/fmriprep/sub-0004/fmriprep/sub-0004/ses-2/func/*run-[12]_space-MNI152NLin*brain_mask*",
    "derivatives/fmriprep/sub-0006/fmriprep/sub-0006/ses-2/func/*run-[12]*confounds_timeseries*",
    "derivatives/fmriprep/sub-0006/fmriprep/sub-0006/ses-2/func/*run-[12]_space-MNI152NLin*preproc_bold*",
    "derivatives/fmriprep/sub-0006/fmriprep/sub-0006/ses-2/func/*run-[12]_space-MNI152NLin*brain_mask*",
]

download(dataset="ds005381", include=include, target_dir=demo_dir, verify_hash=False)

# Create a "dataset_description" file for the pipeline folder if needed
import json

desc = {
    "Name": "fMRIPrep - fMRI PREProcessing workflow",
    "BIDSVersion": "1.0.0",
    "DatasetType": "derivative",
    "GeneratedBy": [{"Name": "fMRIPrep", "Version": "20.2.0", "CodeURL": "https://github.com/nipreps/fmriprep"}],
}

with open("neurocaps_demo/derivatives/fmriprep/dataset_description.json", "w", encoding="utf-8") as f:
    json.dump(desc, f)
```
```python
from neurocaps.extraction import TimeseriesExtractor
from neurocaps.analysis import CAP
# Set specific confounds for nuisance regression
confounds = ["cosine*", "trans_x", "trans_y", "trans_z", "rot_x", "rot_y", "rot_z"]
# Set parcellation
parcel_approach = {"Schaefer": {"n_rois": 100, "yeo_networks": 7, "resolution_mm": 2}}
# Initialize TimeseriesExtractor
extractor = TimeseriesExtractor(
    space="MNI152NLin6Asym",
    parcel_approach=parcel_approach,
    standardize="zscore_sample",
    use_confounds=True,
    detrend=True,
    low_pass=0.1,
    high_pass=None,
    n_acompcor_separate=2,  # 2 acompcor components each from the WM and CSF masks = 4 total
    confound_names=confounds,
    fd_threshold={"threshold": 0.35, "outlier_percentage": 0.20, "n_before": 2, "n_after": 1, "use_sample_mask": True},
)
# Extract timeseries for subjects in the BIDS directory; subject 0006 run-1 will be flagged and skipped
extractor.get_bold(
    bids_dir="neurocaps_demo",
    pipeline_name="fmriprep",  # Can specify if multiple pipelines exist in the derivatives directory
    task="DET",
    condition="late",  # Can extract a specific condition if events.tsv is available
    condition_tr_shift=2,  # Account for hemodynamic lag
    slice_time_ref=1,  # Adjust if the last volume was used as the slice-time reference
    session="2",
    n_cores=None,
    verbose=True,
    progress_bar=False,  # Parameter available in version >=0.21.5
).timeseries_to_pickle("neurocaps_demo/derivatives", "timeseries.pkl")
```
**Output:**
```
2025-02-17 13:34:09,870 neurocaps._utils.extraction.check_confound_names [INFO] Confound regressors to be used if available: cosine*, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z.
2025-02-17 13:34:12,689 neurocaps.extraction.timeseriesextractor [INFO] BIDS Layout: ...rs\donis\Github\neurocaps_demo | Subjects: 2 | Sessions: 2 | Runs: 4
2025-02-17 13:34:13,005 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 1] Preparing for Timeseries Extraction using [FILE: sub-0004_ses-2_task-DET_run-1_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:13,038 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 1] The following confounds will be used for nuisance regression: cosine00, cosine01, cosine02, cosine03, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z, a_comp_cor_00, a_comp_cor_01, a_comp_cor_33, a_comp_cor_34.
2025-02-17 13:34:26,421 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 1] Nuisance regression completed; extracting [CONDITION: late].
2025-02-17 13:34:26,469 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 2] Preparing for Timeseries Extraction using [FILE: sub-0004_ses-2_task-DET_run-2_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:26,488 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 2] The following confounds will be used for nuisance regression: cosine00, cosine01, cosine02, cosine03, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z, a_comp_cor_00, a_comp_cor_01, a_comp_cor_100, a_comp_cor_101.
2025-02-17 13:34:40,285 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 2] Nuisance regression completed; extracting [CONDITION: late].
2025-02-17 13:34:40,334 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 1] Preparing for Timeseries Extraction using [FILE: sub-0006_ses-2_task-DET_run-1_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:40,353 neurocaps._utils.extraction.extract_timeseries [WARNING] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 1] Timeseries Extraction Skipped: Run flagged due to more than 20.0% of the volumes exceeding the framewise displacement threshold of 0.35. Percentage of volumes exceeding the threshold limit is 21.62162162162162% for [CONDITION: late].
2025-02-17 13:34:40,353 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 2] Preparing for Timeseries Extraction using [FILE: sub-0006_ses-2_task-DET_run-2_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:40,370 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 2] The following confounds will be used for nuisance regression: cosine00, cosine01, cosine02, cosine03, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z, a_comp_cor_00, a_comp_cor_01, a_comp_cor_24, a_comp_cor_25.
2025-02-17 13:34:53,680 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 2] Nuisance regression completed; extracting [CONDITION: late].
```
**Note:** Refer to [neurocaps' Logging Documentation](https://neurocaps.readthedocs.io/en/stable/logging.html) for
additional information about logging.
```python
# Initialize CAP class
cap_analysis = CAP(parcel_approach=extractor.parcel_approach)
# Pkl files can also be used as input for `subject_timeseries`; only 2 clusters for simplicity
cap_analysis.get_caps(subject_timeseries=extractor.subject_timeseries, n_clusters=2, standardize=True)
# `sharey` only applicable to outer product plots
kwargs = {
    "sharey": True,
    "ncol": 3,
    "subplots": True,
    "cmap": "coolwarm",
    "xticklabels_size": 10,
    "yticklabels_size": 10,
    "xlabel_rotation": 90,
    "cbarlabels_size": 10,
}
# Outer Product
cap_analysis.caps2plot(visual_scope="regions", plot_options=["outer_product"], suffix_title="DET Task - late", **kwargs)
# Heatmap
kwargs["xlabel_rotation"] = 0
cap_analysis.caps2plot(visual_scope="regions", plot_options=["heatmap"], suffix_title="DET Task - late", **kwargs)
```
**Plot Outputs:**
<img src="assets/outerproduct.png" width=70% height=70%>
<img src="assets/heatmap.png" width=70% height=70%>
```python
# Get CAP metrics
outputs = cap_analysis.calculate_metrics(
    subject_timeseries=extractor.subject_timeseries,
    tr=2.0,  # TR to convert persistence to time units
    return_df=True,
    metrics=["temporal_fraction", "persistence"],
    continuous_runs=True,
    progress_bar=False,
)
# Subject 0006 only has run-2 data since run-1 was flagged during timeseries extraction
print(outputs["temporal_fraction"])
```
**DataFrame Output:**
| Subject_ID | Group | Run | CAP-1 | CAP-2 |
| --- | --- | --- | --- | --- |
| 0004 | All Subjects | run-continuous | 0.344262 | 0.655738 |
| 0006 | All Subjects | run-2 | 0.366667 | 0.633333 |
```python
# Create surface plots
kwargs = {
    "cmap": "cold_hot",
    "layout": "row",
    "size": (500, 200),
    "zoom": 1,
    "cbar_kws": {"location": "bottom"},
    "color_range": (-1, 1),
}
cap_analysis.caps2surf(progress_bar=False, **kwargs)
```
**Plot Outputs:**
<img src="assets/cap1.png" width=70% height=70%>
<img src="assets/cap2.png" width=70% height=70%>
```python
# Create Pearson correlation matrix
kwargs = {"annot": True, "cmap": "viridis", "xticklabels_size": 10, "yticklabels_size": 10, "cbarlabels_size": 10}
cap_analysis.caps2corr(**kwargs)
```
**Plot Output:**
<img src="assets/correlation.png" width=70% height=70%>
```python
# Create radar plots showing cosine similarity between region/networks and caps
radialaxis = {
    "showline": True,
    "linewidth": 2,
    "linecolor": "rgba(0, 0, 0, 0.25)",
    "gridcolor": "rgba(0, 0, 0, 0.25)",
    "ticks": "outside",
    "tickfont": {"size": 14, "color": "black"},
    "range": [0, 0.6],
    "tickvals": [0.1, "", "", 0.4, "", "", 0.6],
}
legend = {
    "yanchor": "top",
    "y": 0.99,
    "x": 0.99,
    "title_font_family": "Times New Roman",
    "font": {"size": 12, "color": "black"},
}
colors = {"High Amplitude": "black", "Low Amplitude": "orange"}
kwargs = {
    "radialaxis": radialaxis,
    "fill": "toself",
    "legend": legend,
    "color_discrete_map": colors,
    "height": 400,
    "width": 600,
}
cap_analysis.caps2radar(**kwargs)
```
**Plot Outputs:**
<img src="assets/cap1radar.png" width=70% height=70%>
<img src="assets/cap2radar.png" width=70% height=70%>
```python
# Get transition probabilities for all participants in a dataframe, then convert to an averaged matrix
from neurocaps.analysis import transition_matrix
# Optimal cluster sizes are saved automatically
cap_analysis.get_caps(
    subject_timeseries=extractor.subject_timeseries,
    cluster_selection_method="silhouette",
    standardize=True,
    show_figs=True,
    n_clusters=range(2, 6),
    progress_bar=True,
)

outputs = cap_analysis.calculate_metrics(
    subject_timeseries=extractor.subject_timeseries,
    return_df=True,
    metrics=["transition_probability"],
    continuous_runs=True,
    progress_bar=False,
)
print(outputs["transition_probability"]["All Subjects"])

kwargs = {
    "cmap": "Blues",
    "fmt": ".3f",
    "annot": True,
    "vmin": 0,
    "vmax": 1,
    "xticklabels_size": 10,
    "yticklabels_size": 10,
    "cbarlabels_size": 10,
}
trans_outputs = transition_matrix(
    trans_dict=outputs["transition_probability"], show_figs=True, return_df=True, **kwargs
)
print(trans_outputs["All Subjects"])
print(trans_outputs["All Subjects"])
```
**Outputs:**
```
Clustering [GROUP: All Subjects]: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:00<00:00, 22.42it/s]
2025-02-17 13:34:58,392 neurocaps.analysis.cap [INFO] [GROUP: All Subjects | METHOD: silhouette] Optimal cluster size is 2.
```
<img src="assets/silhouette.png" width=70% height=70%>
| Subject_ID | Group | Run | 1.1 | 1.2 | 2.1 | 2.2 |
| --- | --- | --- | --- | --- | --- | --- |
| 0004 | All Subjects | run-continuous | 0.743590 | 0.256410 | 0.523810 | 0.476190 |
| 0006 | All Subjects | run-2 | 0.722222 | 0.277778 | 0.454545 | 0.545455 |
<img src="assets/transprob.png" width=70% height=70%>
| From/To | CAP-1 | CAP-2 |
| --- | --- | --- |
| CAP-1 | 0.732906 | 0.267094 |
| CAP-2 | 0.489177 | 0.510823 |
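As a sanity check, the group matrix above is the element-wise average of the per-subject transition probabilities. The values below are copied from the tables in this README; tiny differences in the last digit come from rounding:

```python
# Per-subject 2x2 transition matrices (rows: from CAP-1, from CAP-2)
subject_matrices = {
    "0004": [[0.743590, 0.256410], [0.523810, 0.476190]],
    "0006": [[0.722222, 0.277778], [0.454545, 0.545455]],
}
n = len(subject_matrices)

# Element-wise average across subjects
averaged = [
    [sum(m[i][j] for m in subject_matrices.values()) / n for j in range(2)]
    for i in range(2)
]
```

Each row of the averaged matrix still sums to 1, since every subject's rows are probability distributions over the next CAP.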
## Acknowledgements
neurocaps relies on several popular data processing, machine learning, neuroimaging, and visualization
[packages](https://neurocaps.readthedocs.io/en/stable/#dependencies).
Additionally, some foundational concepts in this package take inspiration from features or design patterns implemented
in other neuroimaging Python packages, specifically:
- mtorabi59's [pydfc](https://github.com/neurodatascience/dFC), a toolbox for comparing several popular
dynamic functional connectivity methods.
- 62442katieb's [idconn](https://github.com/62442katieb/IDConn), a pipeline for assessing individual differences in
resting-state or task-based functional connectivity.
## Contributing
Please refer to the [contributing guidelines](https://github.com/donishadsmith/neurocaps/blob/test/CONTRIBUTING.md) for details on how to contribute to neurocaps.
## References
[^1]: Liu, X., Zhang, N., Chang, C., & Duyn, J. H. (2018). Co-activation patterns in resting-state fMRI signals. NeuroImage, 180, 485–494. https://doi.org/10.1016/j.neuroimage.2018.01.041
[^2]: Yang, H., Zhang, H., Di, X., Wang, S., Meng, C., Tian, L., & Biswal, B. (2021). Reproducible coactivation patterns of functional brain networks reveal the aberrant dynamic state transition in schizophrenia. NeuroImage, 237, 118193. https://doi.org/10.1016/j.neuroimage.2021.118193
[^3]: Zhang, R., Yan, W., Manza, P., Shokri-Kojori, E., Demiral, S. B., Schwandt, M., Vines, L., Sotelo, D., Tomasi, D., Giddens, N. T., Wang, G., Diazgranados, N., Momenan, R., & Volkow, N. D. (2023).
Disrupted brain state dynamics in opioid and alcohol use disorder: attenuation by nicotine use. Neuropsychopharmacology, 49(5), 876–884. https://doi.org/10.1038/s41386-023-01750-w
[^4]: Ingwersen, T., Mayer, C., Petersen, M., Frey, B. M., Fiehler, J., Hanning, U., Kühn, S., Gallinat, J., Twerenbold, R., Gerloff, C., Cheng, B., Thomalla, G., & Schlemm, E. (2024). Functional MRI brain state occupancy in the presence of cerebral small vessel disease — A pre-registered replication analysis of the Hamburg City Health Study. Imaging Neuroscience, 2, 1–17. https://doi.org/10.1162/imag_a_00122
[^5]: Kupis, L., Romero, C., Dirks, B., Hoang, S., Parladé, M. V., Beaumont, A. L., Cardona, S. M., Alessandri, M., Chang, C., Nomi, J. S., & Uddin, L. Q. (2020). Evoked and intrinsic brain network dynamics in children with autism spectrum disorder. NeuroImage: Clinical, 28, 102396. https://doi.org/10.1016/j.nicl.2020.102396
[^6]: Gu, H., Lee, J., Kim, S., Lim, J., Lee, H.-J., Lee, H., Choe, M., Yoo, D.-G., Ryu, J. H., Lim, S., & Lee, S.-H. (2024). Discrimination-Estimation Task. OpenNeuro. [Dataset] https://doi.org/10.18112/openneuro.ds005381.v1.0.0
Raw data
{
"_id": null,
"home_page": null,
"name": "neurocaps",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9.0",
"maintainer_email": null,
"keywords": "python, Co-Activation Patterns, CAPs, neuroimaging, fmri, dfc, dynamic functional connectivity, fMRIPrep",
"author": null,
"author_email": "Donisha Smith <donishasmith@outlook.com>",
"download_url": "https://files.pythonhosted.org/packages/d4/69/8c9d3f42a35159285074ef0e90bf0f74d10a4ff647b09b59a4849273f0d8/neurocaps-0.22.2.tar.gz",
"platform": null,
"description": "# neurocaps\r\n[](https://pypi.python.org/pypi/neurocaps/)\r\n[](https://pypi.python.org/pypi/neurocaps/)\r\n[](https://doi.org/10.5281/zenodo.14886003)\r\n[](https://github.com/donishadsmith/neurocaps)\r\n[](https://github.com/donishadsmith/neurocaps/actions/workflows/testing.yaml)\r\n[](http://neurocaps.readthedocs.io/en/stable/?badge=stable)\r\n[](https://codecov.io/github/donishadsmith/neurocaps)\r\n[](https://github.com/psf/black)\r\n[](https://opensource.org/licenses/MIT)\r\n\r\n[](https://hub.docker.com/r/donishadsmith/neurocaps/tags/)\r\n\r\nneurocaps is a Python package for performing Co-activation Patterns (CAPs) analyses on resting-state or task-based fMRI\r\ndata (resting-state & task-based). CAPs identifies recurring brain states through k-means clustering of BOLD timeseries\r\ndata [^1].\r\n\r\n**Note:** neurocaps is most optimized for fMRI data preprocessed with fMRIPrep and assumes the data is BIDs compliant.\r\nRefer to [neurocaps' BIDS Structure and Entities Documentation](https://neurocaps.readthedocs.io/en/stable/bids.html)\r\nfor additional information.\r\n\r\n## Installation\r\nTo install neurocaps, follow the instructions below using your preferred terminal.\r\n\r\n### Standard Installation from PyPi\r\n```bash\r\n\r\npip install neurocaps\r\n\r\n```\r\n\r\n**Windows Users**\r\n\r\nTo avoid installation errors related to long paths not being enabled, pybids will not be installed by default.\r\nRefer to official [Microsoft documentation](https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=powershell)\r\nto enable long paths.\r\n\r\nTo include pybids in your installation, use:\r\n\r\n```bash\r\n\r\npip install neurocaps[windows]\r\n\r\n```\r\n\r\nAlternatively, you can install pybids separately:\r\n\r\n```bash\r\n\r\npip install pybids\r\n\r\n```\r\n### Installation from Source (Development Version)\r\nTo install the latest development version from the source, there are two options:\r\n\r\n1. 
Install directly via pip:\r\n```bash\r\n\r\npip install git+https://github.com/donishadsmith/neurocaps.git\r\n\r\n```\r\n\r\n2. Clone the repository and install locally:\r\n\r\n```bash\r\n\r\ngit clone https://github.com/donishadsmith/neurocaps/\r\ncd neurocaps\r\npip install -e .\r\n\r\n```\r\n**Windows Users**\r\n\r\nTo include pybids when installing the development version on Windows, use:\r\n\r\n```bash\r\n\r\ngit clone https://github.com/donishadsmith/neurocaps/\r\ncd neurocaps\r\npip install -e .[windows]\r\n```\r\n\r\n## Docker\r\n\r\nIf [Docker](https://docs.docker.com/) is available on your system, you can use the neurocaps Docker image, which\r\nincludes the demos and configures a headless display for VTK.\r\n\r\nTo pull the Docker image:\r\n```bash\r\n\r\ndocker pull donishadsmith/neurocaps && docker tag donishadsmith/neurocaps neurocaps\r\n```\r\n\r\nThe image can be run as:\r\n\r\n1. An interactive bash session (default):\r\n\r\n```bash\r\n\r\ndocker run -it neurocaps\r\n```\r\n\r\n2. 
A Jupyter Notebook with port forwarding:\r\n\r\n```bash\r\n\r\ndocker run -it -p 9999:9999 neurocaps notebook\r\n```\r\n\r\n## Usage\r\n**Note, documentation of each function can be found in the [API](https://neurocaps.readthedocs.io/en/stable/api.html)\r\nsection of the documentation homepage.**\r\n\r\n**This package contains two main classes: `TimeseriesExtractor` for extracting the timeseries, and `CAP` for performing the CAPs analysis.**\r\n\r\n**Main features for `TimeseriesExtractor` includes:**\r\n- **Timeseries Extraction:** Extract timeseries for resting-state or task data using Schaefer, AAL, or a lateralized Custom parcellation for spatial dimensionality reduction.\r\n- **Parallel Processing:** Use parallel processing to speed up timeseries extraction.\r\n- **Saving Timeseries:** Save the nested dictionary containing timeseries as a pickle file.\r\n- **Visualization:** Visualize the timeseries at the region or node level of the parcellation.\r\n\r\n**Main features for `CAP` includes:**\r\n- **Grouping:** Perform CAPs analysis for entire sample or groups of subject IDs.\r\n- **Optimal Cluster Size Identification:** Perform the Davies Bouldin, Silhouette, Elbow, or Variance Ratio criterions to identify the optimal cluster size and automatically save the optimal model as an attribute.\r\n- **Parallel Processing:** Use parallel processing to speed up optimal cluster size identification.\r\n- **CAP Visualization:** Visualize the CAPs as outer products or heatmaps at either the region or node level of the parcellation.\r\n- **Save CAPs as NifTIs:** Convert the atlas used for parcellation to a statistical NifTI image.\r\n- **Surface Plot Visualization:** Project CAPs onto a surface plot.\r\n- **Correlation Matrix Creation:** Create a correlation matrix from CAPs.\r\n- **CAP Metrics Calculation:** Calculate several CAP metrics as described in [Liu et al., 2018](https://doi.org/10.1016/j.neuroimage.2018.01.041)[^1] and [Yang et al., 
2021](https://doi.org/10.1016/j.neuroimage.2021.118193)[^2]:\r\n - *Temporal Fraction:* The proportion of total volumes spent in a single CAP over all volumes in a run.\r\n - *Persistence:* The average time spent in a single CAP before transitioning to another CAP\r\n - *Counts:* The total number of initiations of a specific CAP across an entire run. An initiation is defined as the first occurrence of a CAP.\r\n - *Transition Frequency:* The number of transitions between different CAPs across the entire run.\r\n - *Transition Probability:* The probability of transitioning from one CAP to another CAP (or the same CAP). This is calculated as (Number of transitions from A to B)/(Total transitions from A).\r\n- **Cosine Similarity Radar Plots:** Create radar plots showing the cosine similarity between positive and negative\r\nactivations of each CAP and each a-priori regions in a parcellation [^3] [^4].\r\n\r\n**Additionally, the `neurocaps.analysis` submodule contains additional functions:**\r\n\r\n- `merge_dicts`: Merge the subject_timeseries dictionaries for overlapping subjects across tasks to identify similar CAPs across different tasks [^5]. 
The merged dictionary can be saved as a pickle file.\r\n- `standardize`: Standardizes each run independently for all subjects in the subject timeseries.\r\n- `change_dtype`: Changes the dtype of all subjects in the subject timeseries to help with memory usage.\r\n- `transition_matrix`: Uses the \"transition_probability\" output from ``CAP.calculate_metrics`` to generate and visualize the averaged transition probability matrix for all groups from the analysis.\r\n\r\nRefer to the [demos](https://github.com/donishadsmith/neurocaps/tree/main/demos) or\r\nthe [tutorials](https://neurocaps.readthedocs.io/en/stable/examples/examples.html) on the documentation website\r\nfor a more extensive demonstration of the features included in this package.\r\n\r\n**Demonstration**:\r\n\r\nUse dataset from OpenNeuro [^6]:\r\n```python\r\n# Download Sample Dataset from OpenNeuro, requires the openneuro-py package\r\n# [Dataset] doi: doi:10.18112/openneuro.ds005381.v1.0.0\r\nimport os\r\nfrom openneuro import download\r\n\r\ndemo_dir = \"neurocaps_demo\"\r\nos.makedirs(demo_dir)\r\n\r\n# Include the run-1 and run-2 data from two tasks for two subjects\r\ninclude = [\r\n \"dataset_description.json\",\r\n \"sub-0004/ses-2/func/*run-[12]*events*\",\r\n \"sub-0006/ses-2/func/*run-[12]*events*\",\r\n \"derivatives/fmriprep/sub-0004/fmriprep/sub-0004/ses-2/func/*run-[12]*confounds_timeseries*\",\r\n \"derivatives/fmriprep/sub-0004/fmriprep/sub-0004/ses-2/func/*run-[12]_space-MNI152NLin*preproc_bold*\",\r\n \"derivatives/fmriprep/sub-0004/fmriprep/sub-0004/ses-2/func/*run-[12]_space-MNI152NLin*brain_mask*\",\r\n \"derivatives/fmriprep/sub-0006/fmriprep/sub-0006/ses-2/func/*run-[12]*confounds_timeseries*\",\r\n \"derivatives/fmriprep/sub-0006/fmriprep/sub-0006/ses-2/func/*run-[12]_space-MNI152NLin*preproc_bold*\",\r\n \"derivatives/fmriprep/sub-0006/fmriprep/sub-0006/ses-2/func/*run-[12]_space-MNI152NLin*brain_mask*\",\r\n]\r\n\r\ndownload(dataset=\"ds005381\", include=include, 
    target_dir=demo_dir, verify_hash=False)

# Create a "dataset_description" file for the pipeline folder if needed
import json

desc = {
    "Name": "fMRIPrep - fMRI PREProcessing workflow",
    "BIDSVersion": "1.0.0",
    "DatasetType": "derivative",
    "GeneratedBy": [{"Name": "fMRIPrep", "Version": "20.2.0", "CodeURL": "https://github.com/nipreps/fmriprep"}],
}

with open("neurocaps_demo/derivatives/fmriprep/dataset_description.json", "w", encoding="utf-8") as f:
    json.dump(desc, f)
```

```python
from neurocaps.extraction import TimeseriesExtractor
from neurocaps.analysis import CAP

# Set specific confounds for nuisance regression
confounds = ["cosine*", "trans_x", "trans_y", "trans_z", "rot_x", "rot_y", "rot_z"]

# Set parcellation
parcel_approach = {"Schaefer": {"n_rois": 100, "yeo_networks": 7, "resolution_mm": 2}}

# Initialize TimeseriesExtractor
extractor = TimeseriesExtractor(
    space="MNI152NLin6Asym",
    parcel_approach=parcel_approach,
    standardize="zscore_sample",
    use_confounds=True,
    detrend=True,
    low_pass=0.1,
    high_pass=None,
    n_acompcor_separate=2,  # 2 acompcor components each from the WM and CSF masks = 4 total
    confound_names=confounds,
    fd_threshold={"threshold": 0.35, "outlier_percentage": 0.20, "n_before": 2, "n_after": 1, "use_sample_mask": True},
)

# Extract timeseries for subjects in the BIDS directory; subject 0006 run-1 will be flagged and skipped
extractor.get_bold(
    bids_dir="neurocaps_demo",
    pipeline_name="fmriprep",  # Can be specified if multiple pipelines exist in the derivatives directory
    task="DET",
    condition="late",  # A specific condition can be extracted if an events.tsv file is available
    condition_tr_shift=2,  # Account for hemodynamic lag
    slice_time_ref=1,  # Adjust if the last volume was used as the slice time reference when extracting a condition
    session="2",
    n_cores=None,
    verbose=True,
    progress_bar=False,  # Parameter available in versions >=0.21.5
).timeseries_to_pickle("neurocaps_demo/derivatives", "timeseries.pkl")
```
**Output:**
```
2025-02-17 13:34:09,870 neurocaps._utils.extraction.check_confound_names [INFO] Confound regressors to be used if available: cosine*, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z.
2025-02-17 13:34:12,689 neurocaps.extraction.timeseriesextractor [INFO] BIDS Layout: ...rs\donis\Github\neurocaps_demo | Subjects: 2 | Sessions: 2 | Runs: 4
2025-02-17 13:34:13,005 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 1] Preparing for Timeseries Extraction using [FILE: sub-0004_ses-2_task-DET_run-1_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:13,038 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 1] The following confounds will be used for nuisance regression: cosine00, cosine01, cosine02, cosine03, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z, a_comp_cor_00, a_comp_cor_01, a_comp_cor_33, a_comp_cor_34.
2025-02-17 13:34:26,421 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 1] Nuisance regression completed; extracting [CONDITION: late].
2025-02-17 13:34:26,469 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 2] Preparing for Timeseries Extraction using [FILE: sub-0004_ses-2_task-DET_run-2_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:26,488 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 2] The following confounds will be used for nuisance regression: cosine00, cosine01, cosine02, cosine03, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z, a_comp_cor_00, a_comp_cor_01, a_comp_cor_100, a_comp_cor_101.
2025-02-17 13:34:40,285 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0004 | SESSION: 2 | TASK: DET | RUN: 2] Nuisance regression completed; extracting [CONDITION: late].
2025-02-17 13:34:40,334 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 1] Preparing for Timeseries Extraction using [FILE: sub-0006_ses-2_task-DET_run-1_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:40,353 neurocaps._utils.extraction.extract_timeseries [WARNING] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 1] Timeseries Extraction Skipped: Run flagged due to more than 20.0% of the volumes exceeding the framewise displacement threshold of 0.35. Percentage of volumes exceeding the threshold limit is 21.62162162162162% for [CONDITION: late].
2025-02-17 13:34:40,353 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 2] Preparing for Timeseries Extraction using [FILE: sub-0006_ses-2_task-DET_run-2_space-MNI152NLin6Asym_res-2_desc-preproc_bold.nii.gz].
2025-02-17 13:34:40,370 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 2] The following confounds will be used for nuisance regression: cosine00, cosine01, cosine02, cosine03, trans_x, trans_y, trans_z, rot_x, rot_y, rot_z, a_comp_cor_00, a_comp_cor_01, a_comp_cor_24, a_comp_cor_25.
2025-02-17 13:34:53,680 neurocaps._utils.extraction.extract_timeseries [INFO] [SUBJECT: 0006 | SESSION: 2 | TASK: DET | RUN: 2] Nuisance regression completed; extracting [CONDITION: late].
```

**Note:** Refer to [neurocaps' Logging Documentation](https://neurocaps.readthedocs.io/en/stable/logging.html) for
additional information about logging.

```python
# Initialize CAP class
cap_analysis = CAP(parcel_approach=extractor.parcel_approach)

# Pkl files can also be used as input for `subject_timeseries`
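# For example, the pickle saved earlier by `timeseries_to_pickle` could be passed
# directly instead of the in-memory dictionary (commented out; same analysis, file-based input):
# cap_analysis.get_caps(subject_timeseries="neurocaps_demo/derivatives/timeseries.pkl", n_clusters=2, standardize=True)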
# Only 2 clusters are used here for simplicity
cap_analysis.get_caps(subject_timeseries=extractor.subject_timeseries, n_clusters=2, standardize=True)

# `sharey` is only applicable to outer product plots
kwargs = {
    "sharey": True,
    "ncol": 3,
    "subplots": True,
    "cmap": "coolwarm",
    "xticklabels_size": 10,
    "yticklabels_size": 10,
    "xlabel_rotation": 90,
    "cbarlabels_size": 10,
}

# Outer Product
cap_analysis.caps2plot(visual_scope="regions", plot_options=["outer_product"], suffix_title="DET Task - late", **kwargs)

# Heatmap
kwargs["xlabel_rotation"] = 0

cap_analysis.caps2plot(visual_scope="regions", plot_options=["heatmap"], suffix_title="DET Task - late", **kwargs)
```
**Plot Outputs:**

<img src="assets/outerproduct.png" width=70% height=70%>
<img src="assets/heatmap.png" width=70% height=70%>

```python
# Get CAP metrics
outputs = cap_analysis.calculate_metrics(
    subject_timeseries=extractor.subject_timeseries,
    tr=2.0,  # Repetition time, used to convert persistence to time units
    return_df=True,
    metrics=["temporal_fraction", "persistence"],
    continuous_runs=True,
    progress_bar=False,
)

# Subject 0006 only has run-2 data since run-1 was flagged during timeseries extraction
print(outputs["temporal_fraction"])
```
**DataFrame Output:**

| Subject_ID | Group | Run | CAP-1 | CAP-2 |
| --- | --- | --- | --- | --- |
| 0004 | All Subjects | run-continuous | 0.344262 | 0.655738 |
| 0006 | All Subjects | run-2 | 0.366667 | 0.633333 |

```python
# Create surface plots
kwargs = {
    "cmap": "cold_hot",
    "layout": "row",
    "size": (500, 200),
    "zoom": 1,
    "cbar_kws": {"location": "bottom"},
    "color_range": (-1, 1),
}

cap_analysis.caps2surf(progress_bar=False, **kwargs)
```
**Plot Outputs:**

<img src="assets/cap1.png" width=70% height=70%>
<img src="assets/cap2.png" width=70% height=70%>
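Since `calculate_metrics` returns ordinary pandas DataFrames, standard aggregation applies downstream. A minimal sketch, using the temporal fraction values from the DataFrame output above (reconstructed by hand rather than taken from a live run), of averaging each CAP's temporal fraction across subjects:

```python
import pandas as pd

# Temporal fraction values copied from the DataFrame output shown above
df = pd.DataFrame(
    {
        "Subject_ID": ["0004", "0006"],
        "Group": ["All Subjects", "All Subjects"],
        "Run": ["run-continuous", "run-2"],
        "CAP-1": [0.344262, 0.366667],
        "CAP-2": [0.655738, 0.633333],
    }
)

# Average temporal fraction per CAP across subjects
print(df[["CAP-1", "CAP-2"]].mean())
```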
```python
# Create Pearson correlation matrix
kwargs = {"annot": True, "cmap": "viridis", "xticklabels_size": 10, "yticklabels_size": 10, "cbarlabels_size": 10}

cap_analysis.caps2corr(**kwargs)
```
**Plot Output:**

<img src="assets/correlation.png" width=70% height=70%>

```python
# Create radar plots showing the cosine similarity between regions/networks and CAPs
radialaxis = {
    "showline": True,
    "linewidth": 2,
    "linecolor": "rgba(0, 0, 0, 0.25)",
    "gridcolor": "rgba(0, 0, 0, 0.25)",
    "ticks": "outside",
    "tickfont": {"size": 14, "color": "black"},
    "range": [0, 0.6],
    "tickvals": [0.1, "", "", 0.4, "", "", 0.6],
}

legend = {
    "yanchor": "top",
    "y": 0.99,
    "x": 0.99,
    "title_font_family": "Times New Roman",
    "font": {"size": 12, "color": "black"},
}

colors = {"High Amplitude": "black", "Low Amplitude": "orange"}

kwargs = {
    "radialaxis": radialaxis,
    "fill": "toself",
    "legend": legend,
    "color_discrete_map": colors,
    "height": 400,
    "width": 600,
}

cap_analysis.caps2radar(**kwargs)
```
**Plot Outputs:**

<img src="assets/cap1radar.png" width=70% height=70%>
<img src="assets/cap2radar.png" width=70% height=70%>

```python
# Get transition probabilities for all participants in a dataframe, then convert to an averaged matrix
from neurocaps.analysis import transition_matrix

# Optimal cluster sizes are saved automatically
cap_analysis.get_caps(
    subject_timeseries=extractor.subject_timeseries,
    cluster_selection_method="silhouette",
    standardize=True,
    show_figs=True,
    n_clusters=range(2, 6),
    progress_bar=True,
)

outputs = cap_analysis.calculate_metrics(
    subject_timeseries=extractor.subject_timeseries,
    return_df=True,
    metrics=["transition_probability"],
    continuous_runs=True,
    progress_bar=False,
)

print(outputs["transition_probability"]["All Subjects"])

kwargs = {
    "cmap": "Blues",
    "fmt": ".3f",
    "annot": True,
    "vmin": 0,
    "vmax": 1,
    "xticklabels_size": 10,
    "yticklabels_size": 10,
    "cbarlabels_size": 10,
}

trans_outputs = transition_matrix(
    trans_dict=outputs["transition_probability"], show_figs=True, return_df=True, **kwargs
)

print(trans_outputs["All Subjects"])
```
**Outputs:**
```
Clustering [GROUP: All Subjects]: 100%|██████████████████████████████████████████████████| 4/4 [00:00<00:00, 22.42it/s]
2025-02-17 13:34:58,392 neurocaps.analysis.cap [INFO] [GROUP: All Subjects | METHOD: silhouette] Optimal cluster size is 2.
```

<img src="assets/silhouette.png" width=70% height=70%>

| Subject_ID | Group | Run | 1.1 | 1.2 | 2.1 | 2.2 |
| --- | --- | --- | --- | --- | --- | --- |
| 0004 | All Subjects | run-continuous | 0.743590 | 0.256410 | 0.523810 | 0.476190 |
| 0006 | All Subjects | run-2 | 0.722222 | 0.277778 | 0.454545 | 0.545455 |

<img src="assets/transprob.png" width=70% height=70%>

| From/To | CAP-1 | CAP-2 |
| --- | --- | --- |
| CAP-1 | 0.732906 | 0.267094 |
| CAP-2 | 0.489177 | 0.510823 |

## Acknowledgements
neurocaps relies on several popular data processing, machine learning, neuroimaging, and visualization
[packages](https://neurocaps.readthedocs.io/en/stable/#dependencies).

Additionally, some foundational concepts in this package take inspiration from features or design patterns implemented
in other neuroimaging Python packages, specifically:

- mtorabi59's [pydfc](https://github.com/neurodatascience/dFC), a toolbox for comparing several popular
dynamic functional connectivity methods.
- 62442katieb's [idconn](https://github.com/62442katieb/IDConn), a pipeline for assessing individual differences in
resting-state or task-based functional connectivity.

## Contributing
Please refer to the [contributing guidelines](https://github.com/donishadsmith/neurocaps/blob/test/CONTRIBUTING.md) for details on how to contribute to neurocaps.

## References
[^1]: Liu, X., Zhang, N., Chang, C., & Duyn, J. H. (2018). Co-activation patterns in resting-state fMRI signals. NeuroImage, 180, 485–494. https://doi.org/10.1016/j.neuroimage.2018.01.041

[^2]: Yang, H., Zhang, H., Di, X., Wang, S., Meng, C., Tian, L., & Biswal, B. (2021). Reproducible coactivation patterns of functional brain networks reveal the aberrant dynamic state transition in schizophrenia. NeuroImage, 237, 118193. https://doi.org/10.1016/j.neuroimage.2021.118193

[^3]: Zhang, R., Yan, W., Manza, P., Shokri-Kojori, E., Demiral, S. B., Schwandt, M., Vines, L., Sotelo, D., Tomasi, D., Giddens, N. T., Wang, G., Diazgranados, N., Momenan, R., & Volkow, N. D. (2023). Disrupted brain state dynamics in opioid and alcohol use disorder: attenuation by nicotine use. Neuropsychopharmacology, 49(5), 876–884. https://doi.org/10.1038/s41386-023-01750-w

[^4]: Ingwersen, T., Mayer, C., Petersen, M., Frey, B. M., Fiehler, J., Hanning, U., Kühn, S., Gallinat, J., Twerenbold, R., Gerloff, C., Cheng, B., Thomalla, G., & Schlemm, E. (2024). Functional MRI brain state occupancy in the presence of cerebral small vessel disease — A pre-registered replication analysis of the Hamburg City Health Study. Imaging Neuroscience, 2, 1–17. https://doi.org/10.1162/imag_a_00122

[^5]: Kupis, L., Romero, C., Dirks, B., Hoang, S., Parladé, M. V., Beaumont, A. L., Cardona, S. M., Alessandri, M., Chang, C., Nomi, J. S., & Uddin, L. Q. (2020). Evoked and intrinsic brain network dynamics in children with autism spectrum disorder. NeuroImage: Clinical, 28, 102396. https://doi.org/10.1016/j.nicl.2020.102396

[^6]: Gu, H., Lee, J., Kim, S., Lim, J., Lee, H.-J., Lee, H., Choe, M., Yoo, D.-G., Ryu, J. H., Lim, S., & Lee, S.-H. (2024). Discrimination-Estimation Task. OpenNeuro. [Dataset] https://doi.org/10.18112/openneuro.ds005381.v1.0.0
"bugtrack_url": null,
"license": "MIT License",
"summary": "Co-activation Patterns (CAPs) Python package",
"version": "0.22.2",
"project_urls": {
"Changelog": "https://neurocaps.readthedocs.io/en/stable/changelog.html",
"Github": "https://github.com/donishadsmith/neurocaps",
"Homepage": "https://neurocaps.readthedocs.io",
"Issues": "https://github.com/donishadsmith/neurocaps/issues"
},
"split_keywords": [
"python",
" co-activation patterns",
" caps",
" neuroimaging",
" fmri",
" dfc",
" dynamic functional connectivity",
" fmriprep"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "835626091ae9169d7c3dc848419b94e6b6cbabe17aad0d80d13105b7ee12d2da",
"md5": "f802a2a44a512e7ecc5b821043b38faa",
"sha256": "8cb90f465e1e4021b9a44d78d97cbc4d90ab548ba76044301e735a4838fccf90"
},
"downloads": -1,
"filename": "neurocaps-0.22.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "f802a2a44a512e7ecc5b821043b38faa",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9.0",
"size": 85665,
"upload_time": "2025-02-21T06:55:54",
"upload_time_iso_8601": "2025-02-21T06:55:54.927590Z",
"url": "https://files.pythonhosted.org/packages/83/56/26091ae9169d7c3dc848419b94e6b6cbabe17aad0d80d13105b7ee12d2da/neurocaps-0.22.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "d4698c9d3f42a35159285074ef0e90bf0f74d10a4ff647b09b59a4849273f0d8",
"md5": "6743e58efb9c0e47921d093ba4b134b5",
"sha256": "410fb5ae17114664aa4a2b76b3647472d71b6a13782f1e2576239d2839e9b992"
},
"downloads": -1,
"filename": "neurocaps-0.22.2.tar.gz",
"has_sig": false,
"md5_digest": "6743e58efb9c0e47921d093ba4b134b5",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9.0",
"size": 84995,
"upload_time": "2025-02-21T06:55:56",
"upload_time_iso_8601": "2025-02-21T06:55:56.501115Z",
"url": "https://files.pythonhosted.org/packages/d4/69/8c9d3f42a35159285074ef0e90bf0f74d10a4ff647b09b59a4849273f0d8/neurocaps-0.22.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-02-21 06:55:56",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "donishadsmith",
"github_project": "neurocaps",
"travis_ci": false,
"coveralls": true,
"github_actions": true,
"lcname": "neurocaps"
}