Filtering and trimming for long-read sequencing data (PacBio/ONT). Filtering is based on average read quality and on minimum or maximum read length; trimming removes a fixed number of bases from the start of each read (headcrop) and from the end (tailcrop). Reads passing the filters are printed to standard output.
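The command below sketches how the length and trimming options can be combined in a single pipeline; the filenames are placeholders, and the --maxlength, --headcrop and --tailcrop options are assumed to take a number of bases as described in chopper's usage help:

# keep reads with mean quality >= 10 and length between 500 and 50,000 bp,
# trimming 10 bases from the start and 10 from the end of each read
gunzip -c reads.fq.gz | chopper -q 10 -l 500 --maxlength 50000 --headcrop 10 --tailcrop 10 | gzip > reads.filtered.fq.gz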
Allocate an interactive session and run the program.
Sample session (user input in bold):
[user@biowulf]$ sinteractive
salloc.exe: Pending job allocation 46116226
salloc.exe: job 46116226 queued and waiting for resources
salloc.exe: job 46116226 has been allocated resources
salloc.exe: Granted job allocation 46116226
salloc.exe: Waiting for resource configuration
salloc.exe: Nodes cn3144 are ready for job

[user@cn3144 ~]$ module load chopper
[user@cn3144 ~]$ chopper --threads 2 -q 10 -l 500 < $CHOPPER_HOME/test/test.fastq | gzip > filtered.fastq.gz
Kept 205 reads out of 250 reads

[user@cn3144 ~]$ exit
salloc.exe: Relinquishing job allocation 46116226
[user@biowulf ~]$
Create a batch input file (e.g. chopper.sh). For example:
#!/bin/bash
set -e
module load chopper
chopper -q 10 -l 500 < $CHOPPER_HOME/test/test.fastq | gzip > filtered.fastq.gz
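If more CPUs are allocated at submission time, the script can pass the allocation on to chopper; a sketch, assuming a gzipped placeholder input file named sample.fq.gz:

#!/bin/bash
set -e
module load chopper
# $SLURM_CPUS_PER_TASK matches the value given to --cpus-per-task at submission;
# sample.fq.gz is a placeholder input file
gunzip -c sample.fq.gz | chopper --threads $SLURM_CPUS_PER_TASK -q 10 -l 500 | gzip > sample.filtered.fq.gz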
Submit this job using the Slurm sbatch command.
sbatch [--cpus-per-task=#] [--mem=#] chopper.sh
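For example, to run the threaded variant above with 4 CPUs and 4 GB of memory (illustrative values):

sbatch --cpus-per-task=4 --mem=4g chopper.sh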
Create a swarmfile (e.g. chopper.swarm). For example:
gunzip -c sample1.fq.gz | chopper -q 10 -l 500 | gzip > sample1.filtered.fq.gz
gunzip -c sample2.fq.gz | chopper -q 10 -l 500 | gzip > sample2.filtered.fq.gz
gunzip -c sample3.fq.gz | chopper -q 10 -l 500 | gzip > sample3.filtered.fq.gz
gunzip -c sample4.fq.gz | chopper -q 10 -l 500 | gzip > sample4.filtered.fq.gz
Submit this job using the swarm command.
swarm -f chopper.swarm [-g #] [-t #] --module chopper
where
-g #               Number of gigabytes of memory required for each process (1 line in the swarm command file)
-t #               Number of threads/CPUs required for each process (1 line in the swarm command file)
--module chopper   Loads the chopper module for each subjob in the swarm
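For example, to give each subjob 2 CPUs and 4 GB of memory (illustrative values; if the swarmfile commands use --threads, its value should match -t):

swarm -f chopper.swarm -g 4 -t 2 --module chopper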