Using SLURM: This Time It's Personal
I thought it might be helpful to include a little extra material on SLURM about how to replicate scripts for each subject. When I’m preprocessing a brain scan, each subject is processed separately, but I don’t want to have to type out each script individually. So I usually put together the first script (in this case it’s saved as BE_0.sh) and it looks something like this…
#!/bin/bash
#SBATCH --time=3:00:00
#SBATCH --ntasks=1
#SBATCH --nodes=1
#SBATCH --mem-per-cpu=8192M
#SBATCH -o /fslhome/username/logfiles/oasis/output_BE_0.txt
#SBATCH -e /fslhome/username/logfiles/oasis/error_BE_0.txt
#SBATCH -J "BE_0"
#SBATCH --mail-user=myemail@email.com
#SBATCH --mail-type=BEGIN
#SBATCH --mail-type=END
#SBATCH --mail-type=FAIL
# Torque/PBS compatibility variables expected by some cluster utilities
export PBS_NODEFILE=$(/fslapps/fslutils/generate_pbs_nodefile)
export PBS_JOBID=$SLURM_JOB_ID
export PBS_O_WORKDIR="$SLURM_SUBMIT_DIR"
export PBS_QUEUE=batch
export OMP_NUM_THREADS=$SLURM_CPUS_ON_NODE # let OpenMP use the cores SLURM allocates
ROOT=/fslhome/username/
DATASET=${ROOT}/compute/dataset/
TEMPLATE1=${ROOT}/compute/templates/MICCAI2012-Multi-Atlas-Challenge-Data/
export ANTSPATH=/fslhome/username/bin/antsbin/bin/
PATH=${ANTSPATH}:${PATH}
IFS=$'\n' # split find results on newlines only
array=( $(find ${DATASET}/*/ -type d -name t1) ) # one entry per subject t1 directory
for i in 0;
do
SUB=$(dirname ${array[$i]})
antsBrainExtraction.sh -d 3 \
-a ${SUB}/t1/n4_resliced.nii.gz \
-e ${TEMPLATE1}/T_template0.nii.gz \
-m ${TEMPLATE1}/T_template0_BrainCerebellumProbabilityMask.nii.gz \
-f ${TEMPLATE1}/T_template0_BrainCerebellumRegistrationMask.nii.gz \
-o ${SUB}/t1/
done
The first important thing you should see is how the array variable is set. It finds all of the directories in the /fslhome/username/compute/dataset/ path that contain a t1 folder. This is how I create a variable that represents every subject folder I want to run this script on. Because of for i in 0, the for loop performs a single iteration on the zeroth element of the array variable. The takeaway is that this runs my script on the very first subject in the dataset.
But how will I ever do this on all of the subjects? Well, I’m glad you asked (you are asking that, right?). This is where the sed command comes into play. The following is run in the terminal to replicate the BE_0.sh file for every subject.
ROOT=/fslhome/username/
IFS=$'\n'
array=( $(find ${ROOT}/compute/dataset/*/ -type d -name t1) )
echo ${#array[@]} # sanity check: how many subjects did find pick up?
max=$((${#array[@]}-2)) # BE_0.sh already exists, so we only need to create n-1 copies
for i in $(seq 0 $max);
do
num=$i # num = 0
new=$(($num+1)) # new = 1
# Replace every BE_0 (output log, error log, and job name) with BE_1,
# and bump the loop line "for i in 0" to "for i in 1"; write the result
# to a new file named BE_1.sh
sed -e "s/BE_$num/BE_$new/g" \
    -e "s/^for i in $num;/for i in $new;/" \
    ${ROOT}/scripts/dataset/BE/BE_$num.sh > ${ROOT}/scripts/dataset/BE/BE_$new.sh
done
max=$((${#array[@]}-1)) # this time cover every index, since every script gets submitted
for i in $(seq 0 $max);
do
echo "sbatch ${ROOT}/scripts/dataset/BE/BE_$i.sh ;" >> ${ROOT}/scripts/dataset/run_BE.sh
done
sh ${ROOT}/scripts/dataset/run_BE.sh
You first see that lovely array variable, just as we did in the script. The echo is mostly just here as a sanity check to make sure you used the find command correctly and your array variable actually contains anything. I’ve commented the first iteration of the for loop following the echo command. Each iteration adds 1 to num, creating a new script every time until reaching the total number of desired scripts minus 1 (indexing in bash starts at zero instead of one, so subject one is actually subject zero). The resulting scripts step through the array variable’s indices, going from for i in 0 to for i in 1, for i in 2, and so on until all of the array variable’s indices are represented.
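If you want to convince yourself the replication trick works before pointing it at real job scripts, here is a self-contained sketch on throwaway files; the script_0.sh name and its contents are made up, and the sed here uses pattern matching rather than anything specific to SLURM:

```shell
#!/bin/bash
# Clone a toy script_0.sh into script_1.sh and script_2.sh
WORK=$(mktemp -d)
cat > ${WORK}/script_0.sh <<'EOF'
#SBATCH -J "BE_0"
for i in 0;
do
echo hello
done
EOF

for num in 0 1; do
new=$((num+1))
# Bump both the job name and the loop index in each copy
sed -e "s/BE_$num/BE_$new/g" \
    -e "s/^for i in $num;/for i in $new;/" \
    ${WORK}/script_$num.sh > ${WORK}/script_$new.sh
done

JOBNAME=$(grep -o 'BE_[0-9]' ${WORK}/script_2.sh)
LOOPLINE=$(grep 'for i in' ${WORK}/script_2.sh)
echo "$JOBNAME / $LOOPLINE"   # → BE_2 / for i in 2;
rm -rf ${WORK}
```

Each new copy is generated from the previous one, so the index keeps climbing by one per iteration, which is the whole point of the num/new pair.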
The last for loop just copies the name of each script, preceded by sbatch, into a single file. This conveniently allows a single execution of the run_BE.sh script to call each of the sbatch scripts individually without you having to lift a finger.
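For a sense of what run_BE.sh ends up containing, here is a tiny sketch that builds the same kind of submission wrapper for three made-up script names:

```shell
#!/bin/bash
WORK=$(mktemp -d)
for i in 0 1 2; do
echo "sbatch ${WORK}/BE_$i.sh ;" >> ${WORK}/run_BE.sh
done

# run_BE.sh now holds one sbatch call per script
NUMCALLS=$(grep -c '^sbatch ' ${WORK}/run_BE.sh)
echo "$NUMCALLS sbatch calls queued"   # → 3 sbatch calls queued
rm -rf ${WORK}
```

Running the wrapper with sh then hands every job to the scheduler in one go.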
I apologize if that was a little complicated or poorly explained. Feel free to post questions if you’d like me to clarify anything.