Usage Notes
Execution and the BIDS format
The main input to ASLPrep is the path of the dataset that needs processing.
The input dataset is required to be in valid BIDS format, and it must include at least one T1w structural image. We highly recommend that you validate your dataset with the free, online BIDS Validator.
Important
ASL data in BIDS datasets should already be scaled; in particular, the M0 scans in your dataset should be scaled before running ASLPrep.
Please see the BIDS starter kit for information about converting ASL data to BIDS.
If your data are not already scaled, use the --m0_scale parameter when running ASLPrep.
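To illustrate where an M0 scaling factor enters the calculation, here is a rough sketch of the single-compartment CBF model from the ASL white paper (Alsop et al., 2015). This is illustrative only: the function name, its default parameter values, and its exact form are assumptions for the example, not ASLPrep's actual implementation.

```python
import math

def simple_cbf(delta_m, m0, m0_scale=1.0, labeleff=0.85, t1blood=1.65,
               pld=2.0, tau=1.8, part_coeff=0.9):
    """Sketch of single-compartment PCASL CBF (mL/100g/min).

    ``m0_scale`` multiplies the M0 signal before the division,
    mirroring the role of ASLPrep's --m0_scale parameter.
    Default parameter values are illustrative 3T choices.
    """
    scaled_m0 = m0_scale * m0
    num = 6000.0 * part_coeff * delta_m * math.exp(pld / t1blood)
    den = 2.0 * labeleff * t1blood * scaled_m0 * (1.0 - math.exp(-tau / t1blood))
    return num / den

cbf_unscaled = simple_cbf(delta_m=1.0, m0=100.0)
cbf_scaled = simple_cbf(delta_m=1.0, m0=100.0, m0_scale=10.0)
# CBF is inversely proportional to the scaled M0, so a 10x larger
# m0_scale yields a 10x smaller CBF estimate.
```

Because CBF is divided by the (scaled) M0 signal, an incorrect scale shifts every CBF value by a constant multiplicative factor, which is why pre-scaled data should not be rescaled.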
The exact command to run ASLPrep depends on the Installation method. The common parts of the command follow the BIDS-Apps definition. For example:
aslprep data/bids_root/ out/ participant -w work/
Command-Line Arguments
ASLPrep: ASL PREProcessing workflows v0.6.0
usage: aslprep [-h] [--skip_bids_validation]
[--participant-label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--bids-filter-file FILE] [-d [PATH ...]]
[--bids-database-dir PATH] [--nprocs NPROCS]
[--omp-nthreads OMP_NTHREADS] [--mem MEMORY_GB] [--low-mem]
[--use-plugin FILE] [--sloppy] [--anat-only]
[--level {minimal,resampling,full}] [--boilerplate-only]
[--reports-only]
[--ignore {fieldmaps,sbref,t2w,flair} [{fieldmaps,sbref,t2w,flair} ...]]
[--output-spaces [OUTPUT_SPACES ...]] [--longitudinal]
[--asl2t1w-init {register,header}] [--asl2t1w-dof {6,9,12}]
[--force-bbr | --force-no-bbr] [--dummy-scans DUMMY_SCANS]
[--random-seed _RANDOM_SEED] [--force-ge | --force-no-ge]
[--m0_scale M0_SCALE] [--smooth_kernel SMOOTH_KERNEL]
[--scorescrub] [--basil] [--medial-surface-nan]
[--project-goodvoxels] [--md-only-boilerplate]
[--cifti-output [{91k,170k}]] [--no-msm]
[--skull-strip-template SKULL_STRIP_TEMPLATE]
[--skull-strip-fixed-seed]
[--skull-strip-t1w {auto,skip,force}] [--fmap-bspline]
[--fmap-no-demean] [--use-syn-sdc] [--force-syn]
[--fs-license-file FILE] [--fs-subjects-dir PATH]
[--no-submm-recon] [--fs-no-reconall] [--version] [-v]
[-w WORK_DIR] [--clean-workdir] [--resource-monitor]
[--config-file FILE] [--write-graph] [--stop-on-first-crash]
[--notrack] [--debug {pdb,all} [{pdb,all} ...]]
bids_dir output_dir {participant}
Positional Arguments
- bids_dir
the root folder of a valid BIDS dataset (sub-XXXXX folders should be found at the top level in this folder).
- output_dir
the output path for the outcomes of preprocessing and visual reports
- analysis_level
Possible choices: participant
processing stage to be run, only “participant” in the case of ASLPrep (see BIDS-Apps specification).
Options for filtering BIDS queries
- --skip_bids_validation, --skip-bids-validation
assume the input dataset is BIDS compliant and skip the validation
- --participant-label, --participant_label
A space delimited list of participant identifiers or a single identifier (the sub- prefix can be removed)
- --bids-filter-file
A JSON file describing custom BIDS input filters using PyBIDS. For further details, please check out https://aslprep.readthedocs.io/en/0.6.0/faq.html#how-do-I-select-only-certain-files-to-be-input-to-ASLPrep
- -d, --derivatives
Search PATH(s) for pre-computed derivatives.
- --bids-database-dir
Path to a PyBIDS database folder, for faster indexing (especially useful for large datasets). Will be created if not present.
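A --bids-filter-file is a JSON object mapping query names to PyBIDS entity filters. The following hypothetical example restricts processing to session 01; the top-level keys and entity values shown are illustrative, not a definitive schema:

```json
{
    "t1w": {"datatype": "anat", "session": "01", "suffix": "T1w"},
    "asl": {"datatype": "perf", "session": "01", "suffix": "asl"}
}
```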
Options to handle performance
- --nprocs, --nthreads, --n_cpus, --n-cpus
maximum number of threads across all processes
- --omp-nthreads
maximum number of threads per-process
- --mem, --mem_mb, --mem-mb
upper bound memory limit for ASLPrep processes
- --low-mem
attempt to reduce memory usage (will increase disk usage in working directory)
- --use-plugin, --nipype-plugin-file
nipype plugin configuration file
- --sloppy
Use low-quality tools for speed - TESTING ONLY
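As a sketch of a --use-plugin file, Nipype plugin configuration files are written in YAML; the plugin choice and resource values below are illustrative:

```yaml
plugin: MultiProc
plugin_args:
  n_procs: 8
  memory_gb: 16
```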
Options for performing only a subset of the workflow
- --anat-only
run anatomical workflows only
- --level
Possible choices: minimal, resampling, full
Processing level; may be ‘minimal’ (nothing that can be recomputed), ‘resampling’ (recomputable targets that aid in resampling) or ‘full’ (all target outputs).
- --boilerplate-only, --boilerplate_only
generate boilerplate only
- --reports-only
only generate reports, don’t run workflows. This will only rerun report aggregation, not reportlet generation for specific nodes.
Workflow configuration
- --ignore
Possible choices: fieldmaps, sbref, t2w, flair
ignore selected aspects of the input dataset to disable corresponding parts of the workflow (a space delimited list)
- --output-spaces
Standard and non-standard spaces to resample anatomical and functional images to. Standard spaces may be specified by the form <SPACE>[:cohort-<label>][:res-<resolution>][...], where <SPACE> is a keyword designating a spatial reference, and may be followed by optional, colon-separated parameters. Non-standard spaces imply specific orientations and sampling grids. Note that the res-* modifier does not define the resolution used for the spatial normalization. To generate no ASL outputs, use this option without specifying any spatial references.
- --longitudinal
treat dataset as longitudinal - may increase runtime
- --asl2t1w-init
Possible choices: register, header
Either “register” (the default) to initialize volumes at center or “header” to use the header information when coregistering ASL to T1w images.
- --asl2t1w-dof
Possible choices: 6, 9, 12
Degrees of freedom when registering ASL to T1w images. 6 degrees (rotation and translation) are used by default.
- --force-bbr
Always use boundary-based registration (no goodness-of-fit checks)
- --force-no-bbr
Do not use boundary-based registration (no goodness-of-fit checks)
- --dummy-scans
Number of non-steady-state volumes. Note that this indicates the number of volumes, not the number of control-label pairs in the ASL file.
- --random-seed
Initialize the random seed for the workflow
- --force-ge
Always use the GE-specific processing workflow
- --force-no-ge
Never use the GE-specific processing workflow
- --m0_scale
Relative scale between ASL and M0. M0 scans are multiplied by m0_scale before calculating CBF. Note, however, that BIDS expects ASL and M0 data to be scaled in the raw dataset, so this parameter should only be used if your dataset does not contain pre-scaled data.
- --smooth_kernel
Smoothing kernel for the M0 image(s)
- --scorescrub
Apply Sudipto Dolui’s algorithms for denoising CBF
- --basil
FSL’s CBF computation with spatial regularization and partial volume correction
- --project-goodvoxels
Exclude voxels whose timeseries have locally high coefficient of variation from surface resampling. Only performed for GIFTI files mapped to a freesurfer subject (fsaverage or fsnative).
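Combining the registration and output options above, a command might look like the following; the spatial references shown (a TemplateFlow identifier plus the anatomical space) are illustrative choices, not required values:

```shell
aslprep data/bids_root/ out/ participant \
    --output-spaces MNI152NLin2009cAsym:res-2 anat \
    --asl2t1w-dof 6 \
    -w work/
```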
Options for modulating outputs
- --medial-surface-nan
Replace medial wall values with NaNs on functional GIFTI files. Only performed for GIFTI files mapped to a freesurfer subject (fsaverage or fsnative).
- --md-only-boilerplate
Skip generation of HTML and LaTeX formatted citation with pandoc
- --cifti-output
Possible choices: 91k, 170k
Output preprocessed ASL as a CIFTI dense timeseries. Optionally, the number of grayordinates can be specified (default is 91k, which equates to 2mm resolution)
- --no-msm
Disable Multimodal Surface Matching surface registration.
Specific options for ANTs registrations
- --skull-strip-template
select a template for skull-stripping with antsBrainExtraction
- --skull-strip-fixed-seed
do not use a random seed for skull-stripping - will ensure run-to-run replicability when used with --omp-nthreads 1 and matching --random-seed <int>
- --skull-strip-t1w
Possible choices: auto, skip, force
determiner for T1-weighted skull stripping (‘force’ ensures skull stripping, ‘skip’ ignores skull stripping, and ‘auto’ applies brain extraction based on the outcome of a heuristic to check whether the brain is already masked).
Specific options for handling fieldmaps
- --fmap-bspline
fit a B-Spline field using least-squares (experimental)
- --fmap-no-demean
do not remove median (within mask) from fieldmap
Specific options for SyN distortion correction
- --use-syn-sdc
EXPERIMENTAL: Use fieldmap-free distortion correction
- --force-syn
EXPERIMENTAL/TEMPORARY: Use SyN correction in addition to fieldmap correction, if available
Specific options for FreeSurfer preprocessing
- --fs-license-file
Path to FreeSurfer license key file. Get it (for free) by registering at https://surfer.nmr.mgh.harvard.edu/registration.html
- --fs-subjects-dir
Path to existing FreeSurfer subjects directory to reuse. (default: OUTPUT_DIR/freesurfer)
- --no-submm-recon
Disable sub-millimeter (hires) reconstruction
- --fs-no-reconall
Disable FreeSurfer surface preprocessing.
Other options
- --version
show program’s version number and exit
- -v, --verbose
increases log verbosity for each occurrence, debug level is -vvv
- -w, --work-dir
path where intermediate results should be stored
- --clean-workdir
Clears working directory of contents. Use of this flag is not recommended when running concurrent processes of aslprep.
- --resource-monitor
enable Nipype’s resource monitoring to keep track of memory and CPU usage
- --config-file
Use pre-generated configuration file. Values in file will be overridden by command-line arguments.
- --write-graph
Write workflow graph.
- --stop-on-first-crash
Force stopping on first crash, even if a work directory was specified.
- --notrack
Opt-out of sending tracking information of this run to the aslprep developers. This information helps to improve aslprep and provides an indicator of real world usage crucial for obtaining funding.
- --debug
Possible choices: pdb, all
Debug mode(s) to enable. ‘all’ is alias for all available modes.
Running ASLPrep via Docker containers
For every new version of ASLPrep that is released, a corresponding Docker image is generated.
In order to run ASLPrep Docker images, the Docker Engine must be installed.
If you have used ASLPrep via Docker in the past, you might need to pull down a more recent version of the image:
$ docker pull pennlinc/aslprep:<latest-version>
ASLPrep can be run by interacting directly with the Docker Engine via the docker run command, or through a lightweight wrapper that was created for convenience.
Running ASLPrep directly interacting with the Docker Engine
Running containers as a user.
In order to run Docker smoothly, it is best to prevent permission issues associated with the root file system. Running Docker as a user on the host ensures the ownership of files written during the container execution.
ASLPrep requires a significant amount of memory, typically around 12GB per subject. If using Docker Desktop, you can set this limit in Preferences; you can also set it on the command line (as with the -m flag below).
A docker container can be created using the following command:
$ docker run -ti -m 12GB --rm \
-v path/to/data:/data:ro \
-v path/to/output:/out \
pennlinc/aslprep:<latest-version> \
/data /out/out \
participant
For example:
$ docker run -ti -m 12GB --rm \
-v $HOME/ds000240:/data:ro \
-v $HOME/ds000240-results:/out:rw \
-v $HOME/tmp/ds000240-workdir:/work \
-v ${FREESURFER_HOME}:/fs \
pennlinc/aslprep:<latest-version> \
/data /out/aslprep-<latest-version> \
participant \
--participant-label '01' \
--fs-license-file ${FREESURFER_HOME}/license.txt \
-w /work
See Usage for more information.
Running ASLPrep via Singularity containers
Preparing a Singularity Image
Singularity version >= 2.5: If the version of Singularity installed on your HPC system is modern enough, you can create a Singularity image directly on the system using the following command:
$ singularity build aslprep-<version>.simg docker://pennlinc/aslprep:<version>
where <version> should be replaced with the desired version of ASLPrep that you want to download.
Running a Singularity Image
If the data to be preprocessed is also on the HPC or a personal computer, you are ready to run ASLPrep.
$ singularity run --cleanenv aslprep.simg \
path/to/data/dir path/to/output/dir \
participant \
--participant-label label
Handling environment variables
By default, Singularity interacts with all environment variables from the host.
The host libraries could accidentally conflict with singularity variables.
To avoid such a situation, it is recommended that you use the --cleanenv or -e flag.
For instance:
$ singularity run --cleanenv aslprep.simg \
/work/789/asdf/ $WORK/output \
participant \
--participant-label 01
Relevant aspects of the $HOME directory within the container.
By default, Singularity will bind the user’s $HOME directory on the host into the /home/$USER directory (or equivalent) in the container. Most of the time, it will also redefine the $HOME environment variable and update it to point to the corresponding mount point in /home/$USER.
However, these defaults can be overwritten in your system.
It is recommended that you check your settings with your system’s administrators.
If your Singularity installation allows it, you can work around the $HOME specification by combining the bind mounts argument (-B) with the home overwrite argument (--home) as follows:
$ singularity run -B $HOME:/home/aslprep --home /home/aslprep \
--cleanenv aslprep.simg <aslprep arguments>
The FreeSurfer license
ASLPrep uses FreeSurfer tools, which require a license to run.
To obtain a FreeSurfer license, simply register for free at https://surfer.nmr.mgh.harvard.edu/registration.html.
When using manually-prepared environments or Singularity, FreeSurfer will search for a license key file first using the $FS_LICENSE environment variable and then in the default path to the license key file ($FREESURFER_HOME/license.txt).
If using the --cleanenv flag and $FS_LICENSE is set, use --fs-license-file $FS_LICENSE to pass the license file location to ASLPrep.
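With Singularity, the license file can alternatively be bind-mounted to the container’s default FreeSurfer location (/opt/freesurfer in the ASLPrep image); the host paths here are illustrative:

```shell
$ singularity run --cleanenv \
    -B $HOME/.licenses/freesurfer/license.txt:/opt/freesurfer/license.txt \
    aslprep.simg \
    path/to/data/dir path/to/output/dir participant
```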
It is possible to run the Docker container pointing the image to a local path where a valid license file is stored. For example, if the license is stored in the $HOME/.licenses/freesurfer/license.txt file on the host system:
$ docker run -ti --rm \
-v $HOME/fullds005:/data:ro \
-v $HOME/dockerout:/out \
-v $HOME/.licenses/freesurfer/license.txt:/opt/freesurfer/license.txt \
pennlinc/aslprep:latest \
/data /out/out \
participant \
--ignore fieldmaps
Troubleshooting
Logs and crashfiles are written to the <output dir>/aslprep/sub-<participant_label>/log directory. Information on how to customize and understand these files can be found on the nipype debugging page.
Support and communication
The documentation of this project is found here: https://aslprep.readthedocs.io.
All bugs, concerns and enhancement requests for this software can be submitted here: https://github.com/PennLINC/aslprep/issues.
If you have a question about using ASLPrep, please create a new topic on NeuroStars with the “Software Support” category and the “aslprep” tag. The ASLPrep developers follow NeuroStars, and will be able to answer your question there.