Biowulf High Performance Computing at the NIH
MAGGIE on Biowulf

MAGGIE identifies sequence motifs that regulate transcription factor binding and function. According to the authors:

MAGGIE provides a framework for identifying DNA sequence motifs mediating transcription factor binding and function. By leveraging measurements and genetic variation information from different genotypes (human individuals, animal strains, or alleles), MAGGIE associates the mutation of DNA sequence motif with various types of epigenomic features, including but not limited to transcription factor binding, open chromatin, histone modification, and stimulus response of regulatory elements.

References:

Documentation
Important Notes

Interactive job
Interactive jobs should be used for debugging, graphics, or applications that cannot be run as batch jobs.

Allocate an interactive session and run the program.
Sample session (user input in bold):

[user@biowulf]$ sinteractive --cpus-per-task=8 --gres=lscratch:10
salloc.exe: Pending job allocation 46116226
salloc.exe: job 46116226 queued and waiting for resources
salloc.exe: job 46116226 has been allocated resources
salloc.exe: Granted job allocation 46116226
salloc.exe: Waiting for resource configuration
salloc.exe: Nodes cn3144 are ready for job

[user@cn3144 ~]$ module load MAGGIE

[user@cn3144 ~]$ cd /lscratch/$SLURM_JOB_ID

[user@cn3144 ~]$ cp $MAGGIE_EXAMPLE_DATA/QTL/DNase/*.fa .

[user@cn3144 ~]$ maggie_fasta_input.py dsQTL_high.fa dsQTL_low.fa -p $SLURM_CPUS_PER_TASK
Running MAGGIE on 1013 motifs for 5668 sequences with 8 parallel process
100%|###############################################################| 1013/1013 [38:54<00:00,  1.77s/it]
Successfully saved distribution plots
Successfully saved motif logos
Results are ready in ./maggie_output/

[user@cn3144 ~]$ cp -r maggie_output /data/$USER/

[user@cn3144 ~]$ exit
salloc.exe: Relinquishing job allocation 46116226
[user@biowulf ~]$
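
To analyze your own data instead of the bundled example, give maggie_fasta_input.py a positive and a negative FASTA file (for instance, the sequences carrying the high-signal versus the low-signal alleles). A minimal sketch, where the input file names and output directory are hypothetical placeholders and -p/-o are the same options used elsewhere on this page:

# hypothetical inputs: positive (high-signal) and negative (low-signal) sequence sets
maggie_fasta_input.py my_high.fa my_low.fa -p $SLURM_CPUS_PER_TASK -o my_maggie_output/
# copy the results back to /data before the session (and its /lscratch space) ends
cp -r my_maggie_output /data/$USER/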

Batch job
Most jobs should be run as batch jobs.

Create a batch input file (e.g. MAGGIE.sh). For example:

#!/bin/bash
set -e
module load MAGGIE
cd /lscratch/$SLURM_JOB_ID
cp $MAGGIE_EXAMPLE_DATA/QTL/DNase/*.fa .
maggie_fasta_input.py dsQTL_high.fa dsQTL_low.fa -p $SLURM_CPUS_PER_TASK
cp -r maggie_output /data/$USER/

Submit this job using the Slurm sbatch command.

sbatch [--cpus-per-task=#] [--mem=#] --gres=lscratch:10 MAGGIE.sh
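
Alternatively, the resource requests can be embedded in the script itself with #SBATCH directives. A minimal sketch, reusing the 8 CPUs and 10 GB of local scratch from the interactive example; the memory value is a placeholder to adjust for your data:

#!/bin/bash
#SBATCH --cpus-per-task=8     # sets $SLURM_CPUS_PER_TASK, passed to -p below
#SBATCH --mem=16g             # placeholder; raise for larger input sets
#SBATCH --gres=lscratch:10    # needed because the job works in /lscratch/$SLURM_JOB_ID
set -e
module load MAGGIE
cd /lscratch/$SLURM_JOB_ID
cp $MAGGIE_EXAMPLE_DATA/QTL/DNase/*.fa .
maggie_fasta_input.py dsQTL_high.fa dsQTL_low.fa -p $SLURM_CPUS_PER_TASK
cp -r maggie_output /data/$USER/

With the directives in place, the job can be submitted as plain "sbatch MAGGIE.sh".
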
Swarm of Jobs
A swarm of jobs is an easy way to submit a set of independent commands requiring identical resources.

Create a swarmfile (e.g. MAGGIE.swarm). For example:

maggie_fasta_input.py sample1_high.fa sample1_low.fa -p $SLURM_CPUS_PER_TASK -o outputdir1/
maggie_fasta_input.py sample2_high.fa sample2_low.fa -p $SLURM_CPUS_PER_TASK -o outputdir2/
maggie_fasta_input.py sample3_high.fa sample3_low.fa -p $SLURM_CPUS_PER_TASK -o outputdir3/
maggie_fasta_input.py sample4_high.fa sample4_low.fa -p $SLURM_CPUS_PER_TASK -o outputdir4/

Submit this job using the swarm command.

swarm -f MAGGIE.swarm [-g #] [-t #] --module MAGGIE
where
-g # Number of Gigabytes of memory required for each process (1 line in the swarm command file).
-t # Number of threads/CPUs required for each process (1 line in the swarm command file).
--module MAGGIE Loads the MAGGIE module for each subjob in the swarm
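
For example, to give each line of the swarm file 8 CPUs and 16 GB of memory (placeholder values; the -t value is also what $SLURM_CPUS_PER_TASK expands to inside each command line):

swarm -f MAGGIE.swarm -g 16 -t 8 --module MAGGIE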