BIDS app using deepMReye to decode eye motion from fMRI time series data.

It is meant to be run on preprocessed BIDS derivatives (such as those from fMRIPrep) to predict eye movements from the BOLD data when no eye-tracking data are available.

By default it uses the deepMReye pre-trained "fixation" model.

The first part of the pipeline can, however, be used on its own to extract data, irrespective of the presence of eye-movement labels. This could be useful for sharing anonymized subject data with the deepMReye developers, to help them improve their pre-trained models.
Clone this repository:

```bash
git clone git://github.com/cpp-lln-lab/bidsmreye
```

Then install it:

```bash
cd bidsMReye
pip install .
```

On Linux or macOS you can use `make` to run all the steps of the demo:

```bash
make demo
```

On Windows you will have to download the data and the pre-trained model weights manually:
- data
  - URL: https://osf.io/vufjs/download
  - destination folder: `tests/data/moae_fmriprep`
- model
  - URL: https://osf.io/download/cqf74/
  - destination file: `models/dataset1_guided_fixations.h5`
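Without `make` (for example on Windows), the two downloads above can be scripted. A minimal Python sketch, assuming the data URL serves a zip archive that unpacks into `tests/data/moae_fmriprep`; the helper name `fetch` is illustrative, not part of bidsmreye:

```python
import zipfile
from pathlib import Path
from urllib.request import urlretrieve

# URLs and destinations as listed above.
DATA_URL = "https://osf.io/vufjs/download"
MODEL_URL = "https://osf.io/download/cqf74/"
DATA_DIR = Path("tests/data/moae_fmriprep")
MODEL_FILE = Path("models/dataset1_guided_fixations.h5")


def fetch(url: str, dest: Path) -> bool:
    """Download url to dest, creating parent folders; skip if already there."""
    if dest.exists():
        return True
    dest.parent.mkdir(parents=True, exist_ok=True)
    try:
        urlretrieve(url, dest)
        return True
    except OSError as err:
        print(f"could not fetch {url}: {err}")
        return False


if __name__ == "__main__":
    archive = DATA_DIR.with_suffix(".zip")
    if fetch(DATA_URL, archive):
        # Unpack next to the archive, i.e. into tests/data/.
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(DATA_DIR.parent)
    fetch(MODEL_URL, MODEL_FILE)
```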
You should end up with the following layout:

```
├── models
│   └── dataset1_guided_fixations.h5
└── tests
    └── data
        └── moae_fmriprep
            ├── logs
            └── sub-01
                ├── anat
                ├── figures
                └── func
```

Running the different steps of the demo:
```bash
bids_dir="$PWD/tests/data/moae_fmriprep"
output_dir="$PWD/outputs"

bidsmreye --space MNI152NLin6Asym \
          --task auditory \
          --action prepare \
          "$bids_dir" \
          "$output_dir"

bidsmreye --space MNI152NLin6Asym \
          --task auditory \
          --action combine \
          "$bids_dir" \
          "$output_dir"

bidsmreye --space MNI152NLin6Asym \
          --task auditory \
          --action generalize \
          --model guided_fixations \
          "$bids_dir" \
          "$output_dir"
```
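The three steps above differ only in `--action` (plus `--model` for `generalize`), so they can also be driven from a small script. A hypothetical Python sketch (the helper `demo_commands` is illustrative, not part of bidsmreye), assuming `bidsmreye` is on the PATH:

```python
import subprocess
from pathlib import Path


def demo_commands(bids_dir: str, output_dir: str):
    """Yield the three bidsmreye invocations as argument lists."""
    for action in ("prepare", "combine", "generalize"):
        cmd = [
            "bidsmreye",
            "--space", "MNI152NLin6Asym",
            "--task", "auditory",
            "--action", action,
        ]
        if action == "generalize":
            cmd += ["--model", "guided_fixations"]
        yield cmd + [bids_dir, output_dir]


if __name__ == "__main__":
    cwd = Path.cwd()
    for cmd in demo_commands(str(cwd / "tests/data/moae_fmriprep"),
                             str(cwd / "outputs")):
        try:
            subprocess.run(cmd, check=True)
        except FileNotFoundError:
            print("bidsmreye not found on PATH; install it first")
            break
```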