A toolkit for Dynamic Analysis of Nucleosome and Protein Occupancy by Sequencing.
Features
Run danpos.py -h to see the available functions and their options.
Allocate an interactive session and run the program.
Sample session (user input in bold):
[user@biowulf]$ sinteractive
salloc.exe: Pending job allocation 46116226
salloc.exe: job 46116226 queued and waiting for resources
salloc.exe: job 46116226 has been allocated resources
salloc.exe: Granted job allocation 46116226
salloc.exe: Waiting for resource configuration
salloc.exe: Nodes cn3144 are ready for job

[user@cn3144 ~]$ module load DANPOS
[user@cn3144 ~]$ danpos.py --help
danpos version 3.0.0
For help information for each function, try:
python danpos.py <function> -h

Functions:
    dpos:          analyze each protein-binding position (~100 to ~200bp wide) across the whole genome, e.g. nucleosome positions.
    dpeak:         analyze each protein-binding peak (~1 to ~1kb wide) across the whole genome, e.g. protein that binds accurately to some specific motifs.
    dregion:       analyze each protein-binding region (~1 to ~10kb wide) across the whole genome, e.g. some histone modifications.
    dtriple:       do analysis at all three levels including each region, peak, and position. Would be useful when little is known about the potential binding pattern.
    profile:       analyze wiggle format occupancy or differential signal profile relative to gene structures or bed format genomic regions.
    wiq:           normalize wiggle files to have the same quantiles (quantile normalization).
    wig2wiq:       convert wiggle format file to wiq format.
    stat:          some statistics for positions, peaks, or regions.
    selector:      select a subset of positions, peaks, or regions by value ranges or neighboring gene structures.
    valuesAtRanks: retrieve position, peak, or region values by ranks.

Kaifu Chen, et al. chenkaifu@gmail.com, Li lab, Biostatistics department, Dan L. Duncan cancer center, Baylor College of Medicine.

[user@cn3144 ~]$ exit
salloc.exe: Relinquishing job allocation 46116226
[user@biowulf ~]$
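Within the interactive session, a typical position-level run might look like the following sketch. The directory sampleA/ and the path /data/$USER/danpos_project are placeholders for your own directory of mapped reads; adjust them to match your data.

[user@cn3144 ~]$ cd /data/$USER/danpos_project
[user@cn3144 ~]$ danpos.py dpos sampleA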
Create a batch input file (e.g. DANPOS.sh). For example:
#!/bin/bash
set -e
module load DANPOS
# run the position-level (dpos) analysis on the reads in sampleA
danpos.py dpos sampleA
Submit this job using the Slurm sbatch command.
sbatch --cpus-per-task=2 --mem=2g DANPOS.sh
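The resource requests above are starting values. If a run needs more memory, CPUs, or wall time, the same standard sbatch options apply; the numbers below are purely illustrative:

sbatch --cpus-per-task=4 --mem=8g --time=12:00:00 DANPOS.sh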
Create a swarmfile (e.g. DANPOS.swarm). For example:
cd dir1; danpos.py dpos sampleA
cd dir2; danpos.py dpos sampleB
Submit this job using the swarm command.
swarm -f DANPOS.swarm [-g #] [-t #] --module DANPOS

where
  -g #              Number of gigabytes of memory required for each process (1 line in the swarm command file)
  -t #              Number of threads/CPUs required for each process (1 line in the swarm command file)
  --module DANPOS   Loads the DANPOS module for each subjob in the swarm
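For example, to give each subjob 4 GB of memory and 2 CPUs (values chosen only for illustration), the submission would be:

swarm -f DANPOS.swarm -g 4 -t 2 --module DANPOS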