bidsmreye


- Name: bidsmreye
- Version: 0.3.1
- Home page: https://github.com/cpp-lln-lab/bidsMReye
- Summary: BIDS app using deepMReye to decode eye motion for fMRI time series data
- Upload time: 2023-01-26 22:02:42
- Maintainer: Remi Gau
- Author: Remi Gau
- Requires Python: <3.11,>=3.8.0
- License: LGPL-3.0
- Keywords: BIDS, brain imaging data structure, neuroimaging, automated pipeline, MRI, eyetracking, machine learning
            [![System tests](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/system_tests.yml/badge.svg?branch=main)](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/system_tests.yml)
[![Test and coverage](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/test_and_coverage.yml/badge.svg)](https://github.com/cpp-lln-lab/bidsMReye/actions/workflows/test_and_coverage.yml)
[![codecov](https://codecov.io/gh/cpp-lln-lab/bidsMReye/branch/main/graph/badge.svg?token=G5fm2kaloM)](https://codecov.io/gh/cpp-lln-lab/bidsMReye)
[![Documentation Status](https://readthedocs.org/projects/bidsmreye/badge/?version=latest)](https://bidsmreye.readthedocs.io/en/latest/?badge=latest)
[![License](https://img.shields.io/badge/license-GPL3-blue.svg)](./LICENSE)
[![PyPI version](https://badge.fury.io/py/bidsmreye.svg)](https://badge.fury.io/py/bidsmreye)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/bidsmreye)
![https://github.com/psf/black](https://img.shields.io/badge/code%20style-black-000000.svg)
[![Sourcery](https://img.shields.io/badge/Sourcery-enabled-brightgreen)](https://sourcery.ai)
[![All Contributors](https://img.shields.io/badge/all_contributors-2-orange.svg)](#contributors)
[![paper doi](https://img.shields.io/badge/paper-10.1038%2Fs41593--021--00947--w-blue)](https://doi.org/10.1038/s41593-021-00947-w)
[![zenodo DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.7493322.svg)](https://doi.org/10.5281/zenodo.7493322)



# bidsMReye

BIDS app for decoding gaze position from the eyeball MR-signal using
[deepMReye](https://github.com/DeepMReye/DeepMReye)
([1](https://doi.org/10.1038/s41593-021-00947-w)).

To be used on preprocessed BIDS derivatives (e.g.
[fMRIprep](https://github.com/nipreps/fmriprep) outputs).
No eye-tracking data required.

By default, bidsMReye uses a [pre-trained version](https://osf.io/mrhk9/) of
[deepMReye](https://github.com/DeepMReye/DeepMReye) trained on 5 datasets incl.
guided fixations ([2](https://doi.org/10.1038/sdata.2017.181)), smooth pursuit
([3](https://doi.org/10.1016/j.neuroimage.2018.04.012),[4](https://doi.org/10.1101/2021.08.03.454928),[5](https://doi.org/10.1038/s41593-017-0050-8))
and free viewing ([6](https://doi.org/10.1038/s41593-017-0049-1)). Other
pretrained versions are optional. Dedicated model training is recommended.

The pipeline automatically extracts the eyeball voxels.
These can also be used for other multivariate pattern
analyses in the absence of eye-tracking data.
The decoded gaze positions make it possible to compute eye movements.

Some basic quality control and outlier detection are also performed:

- for each run

![](https://github.com/cpp-lln-lab/bidsMReye/blob/main/docs/source/images/sub-01_task-auditory_space-MNI152NLin6Asym_desc-bidsmreye_eyetrack.png)


- at the group level

![](https://github.com/cpp-lln-lab/bidsMReye/blob/main/docs/source/images/group_eyetrack.png)

For more information, see the
[User Recommendations](https://deepmreye.slite.com/p/channel/MUgmvViEbaATSrqt3susLZ/notes/kKdOXmLqe).
If you have other questions, please reach out to the developer team.

## Install

Using the Docker image is recommended, as there are known installation issues
with deepMReye on some systems (for example Apple M1).

### Docker

#### Build

```bash
docker build --tag cpplab/bidsmreye:latest --file docker/Dockerfile .
```

#### Pull

Pull the latest docker image:

```bash
docker pull cpplab/bidsmreye:latest
```
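
A minimal sketch of running the container, assuming the image exposes the same
command line interface as the `bidsmreye` executable described below and that
input and output folders are bind-mounted (the paths here are illustrative,
not defaults of the image):

```bash
# bind-mount an fMRIprep derivatives folder (read-only) and an output folder,
# then call the BIDS-app style CLI inside the container
docker run --rm -it \
    -v /home/user/my_study/derivatives/fmriprep:/data:ro \
    -v /home/user/my_study/derivatives/bidsmreye:/out \
    cpplab/bidsmreye:latest \
    --action all /data /out participant
```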

### Python package

You can also install the package from PyPI:

```bash
pip install bidsmreye
```

#### Conda installation

**NOT TESTED YET**

To encapsulate bidsMReye in a virtual environment, install it with the following commands:

```bash
conda create --name bidsmreye python=3.10
conda activate bidsmreye
conda install pip
pip install bidsmreye
```

The TensorFlow dependency supports both CPU and GPU instructions.

Note that you might need to install cuDNN first:

```bash
conda install -c conda-forge cudnn
```
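
If you want to double-check the GPU setup, one optional sanity check is to ask
TensorFlow which GPUs it can see (an empty list means it will fall back to the CPU):

```bash
# prints the GPUs visible to TensorFlow; [] means CPU-only execution
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```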

#### ANTsPy installation issues

If the installation of [ANTsPy](https://github.com/ANTsX/ANTsPy) fails, try to install it manually:

<!-- may help on windows ? -->

```bash
git clone https://github.com/ANTsX/ANTsPy
cd ANTsPy
pip install CMake
python3 setup.py install
```

### Dev install

Clone this repository.

```bash
git clone https://github.com/cpp-lln-lab/bidsMReye.git
```

Then install the package:

```bash
cd bidsMReye
make install_dev
```
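
If `make` is not available on your system, a plain editable install is a
reasonable fallback; this assumes a standard Python packaging setup and is not
an official target of the repository:

```bash
# install the cloned package in editable mode (assumption: standard pip setup)
cd bidsMReye
pip install -e .
```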

## Usage

### Requirements

bidsmreye requires your input fMRI data:

- to be minimally preprocessed (at least realigned),
- to have filenames and a structure that conform to a BIDS derivatives dataset
  (see the example below).

Two BIDS apps are available to generate this type of preprocessed data:

- [fmriprep](https://fmriprep.org/en/stable/)
- [bidspm](https://bidspm.readthedocs.io/en/latest/general_information.html)

Obviously, your fMRI data must include the eyes of your participant for bidsmreye to work.
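
As a rough illustration, an fMRIprep-style derivatives dataset that bidsmreye
can read could contain files like the following (subject, task, and space
labels are placeholders, not requirements):

```bash
# hypothetical fMRIprep output for one subject
ls derivatives/fmriprep/sub-01/func/
# sub-01_task-auditory_space-MNI152NLin6Asym_desc-preproc_bold.nii.gz
# sub-01_task-auditory_space-MNI152NLin6Asym_desc-preproc_bold.json
```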

<!-- old fmriprep versions may not work -->

### CLI

Type the following for more information:

```bash
bidsmreye --help
```

### Preparing the data

`--action prepare` means that bidsmreye will extract the data coming from the
eyes from the fMRI images.

If your data is not in MNI space, bidsmreye will also register the data to MNI.

```bash
bidsmreye --action prepare \
          bids_dir \
          output_dir \
          participant
```

### Computing the eye movements

`--action generalize` uses the extracted time series to predict the eye movements
with the default pre-trained model of deepmreye.

This will also generate a quality control report of the decoded eye movements.

```bash
bidsmreye --action generalize \
          bids_dir \
          output_dir \
          participant
```

### Doing it all at once

`--action all` does "prepare" then "generalize".

```bash
bidsmreye --action all \
          bids_dir \
          output_dir \
          participant
```
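
For instance, with an fMRIprep derivatives folder as input dataset (the paths
below are purely illustrative):

```bash
bidsmreye --action all \
          /home/user/my_study/derivatives/fmriprep \
          /home/user/my_study/derivatives/bidsmreye \
          participant
```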

### Group level summary

`--action qc` generates a group level summary of the quality control.

```bash
bidsmreye --action qc \
          bids_dir \
          output_dir \
          group
```

## Demo

Please have a look at the [documentation](https://bidsmreye.readthedocs.io/en/latest/demo.html).

## Contributors ✨

Thanks goes to these wonderful people
([emoji key](https://allcontributors.org/docs/en/emoji-key)):

<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
  <tr>
    <td align="center"><a href="https://weexee.github.io/Portfolio/"><img src="https://avatars.githubusercontent.com/u/91776803?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Pauline Cabee</b></sub></a><br /><a href="https://github.com/cpp-lln-lab/bidsMReye/commits?author=WeeXee" title="Code">💻</a> <a href="#ideas-WeeXee" title="Ideas, Planning, & Feedback">🤔</a> <a href="#infra-WeeXee" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
    <td align="center"><a href="https://remi-gau.github.io/"><img src="https://avatars.githubusercontent.com/u/6961185?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Remi Gau</b></sub></a><br /><a href="https://github.com/cpp-lln-lab/bidsMReye/commits?author=Remi-Gau" title="Code">💻</a> <a href="#ideas-Remi-Gau" title="Ideas, Planning, & Feedback">🤔</a> <a href="https://github.com/cpp-lln-lab/bidsMReye/commits?author=Remi-Gau" title="Tests">⚠️</a> <a href="#maintenance-Remi-Gau" title="Maintenance">🚧</a></td>
  </tr>
</table>

<!-- markdownlint-restore -->
<!-- prettier-ignore-end -->

<!-- ALL-CONTRIBUTORS-LIST:END -->

This project follows the
[all-contributors](https://github.com/all-contributors/all-contributors)
specification. Contributions of any kind welcome!

If you train [deepMReye](https://github.com/DeepMReye/DeepMReye), or if you have
eye-tracking training labels and the extracted eyeball voxels, consider sharing
them to contribute to the [pretrained model pool](https://osf.io/mrhk9/).

            
