omeClust on Biowulf
omeClust detects clusters of features in omics data and scores each metadata variable (the resolution score) by its influence on the clustering. Clusters may differ in how similar their member features are, i.e., in their resolution. The resolution score takes into account not only the similarity between measurements but also the hierarchical structure of the data and the number of features that group together.
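The per-feature similarity score resembles the classic silhouette value: for each feature, a is its mean distance to members of its own cluster and b its mean distance to the nearest other cluster, giving s = (b - a) / max(a, b) (this is the formula omeClust's utilities module evaluates in the demo session below). A minimal sketch with made-up distances:

```shell
# Silhouette-style score s = (b - a) / max(a, b)
# a = mean within-cluster distance, b = mean distance to nearest other cluster
# (the distances below are made up for illustration)
a=0.20
b=0.65
awk -v a="$a" -v b="$b" 'BEGIN { m = (a > b ? a : b); printf "s = %.4f\n", (b - a) / m }'
# prints: s = 0.6923
```

Scores near 1 indicate a feature sits well inside its cluster; scores near 0 (as in the RuntimeWarning shown in the demo, where max(a, b) can be 0) indicate ambiguous placement.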
Documentation
Important Notes
- Module Name: omeClust (see the modules page for more information)
Interactive job
Interactive jobs should be used for debugging, graphics, or applications that cannot be run as batch jobs.
Allocate an interactive session and run the program.
Sample session (user input in bold):
[user@biowulf ~]$ sinteractive --cpus-per-task=2 --mem=4g --gres=lscratch:10
salloc.exe: Pending job allocation 1059698
salloc.exe: job 1059698 queued and waiting for resources
salloc.exe: job 1059698 has been allocated resources
salloc.exe: Granted job allocation 1059698
salloc.exe: Waiting for resource configuration
salloc.exe: Nodes cn0852 are ready for job
srun: error: x11: no local DISPLAY defined, skipping

[user@cn0852 ~]$ cd /lscratch/$SLURM_JOB_ID
[user@cn0852 1059698]$ git clone https://github.com/omicsEye/omeClust.git
Cloning into 'omeClust'...
remote: Enumerating objects: 389, done.
remote: Counting objects: 100% (389/389), done.
remote: Compressing objects: 100% (282/282), done.
remote: Total 389 (delta 237), reused 246 (delta 99), pack-reused 0
Receiving objects: 100% (389/389), 5.40 MiB | 0 bytes/s, done.
Resolving deltas: 100% (237/237), done.

[user@cn0852 1059698]$ module load omeClust
[+] Loading omeClust 1.1.6 on cn0852
[+] Loading singularity 3.6.4 on cn0852

[user@cn0852 1059698]$ cd omeClust/
[user@cn0852 omeClust]$ omeClust -i data/synthetic/dist_4_0.001_4_200.txt \
    --metadata data/synthetic/truth_4_0.001_4_200.txt \
    -o omeclust_demo \
    --plot
/usr/local/lib/python3.6/dist-packages/omeClust/utilities.py:149: RuntimeWarning: invalid value encountered in double_scalars
  s = (b - a) / max(a, b)
The number of major clusters: 3
Ground truth is the most influential metadata in clusters
There are 3 clusters
Output is written in omeclust_demo

[user@cn0852 omeClust]$ ls omeclust_demo/
adist.txt                  Ground truth_MDS_3D_plot.pdf   Ground truth_PCoA_plot.pdf
clusters.txt               Ground truth_MDS_plot.pdf      Ground truth_t-SNE_3D_plot.pdf
dendrogram.pdf             Ground truth_PCA_3D_plot.pdf   Ground truth_t-SNE_plot.pdf
discretize_metadata.txt    Ground truth_PCA_plot.pdf      network_plot.pdf
feature_cluster_label.txt  Ground truth_PCoA_3D_plot.pdf  omeClust_log.txt

[user@cn0852 omeClust]$ exit
exit
salloc.exe: Relinquishing job allocation 1059698
salloc.exe: Job allocation 1059698 has been revoked.

[user@biowulf ~]$
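An interactive session is also a good place to sanity-check your own input files before a large run. Based on the synthetic demo data used above, the `-i` input is a tab-delimited, symmetric distance matrix with matching row and column labels, and `--metadata` maps each feature to one or more variables; this layout is inferred from the demo, so verify it against your omeClust version. A minimal sketch that writes a 3-feature example:

```shell
# Sketch: minimal omeClust-style inputs (layout inferred from the synthetic
# demo data above -- verify against your omeClust version before relying on it)
# Distance matrix: tab-delimited, symmetric, labels on both axes
printf '\tF1\tF2\tF3\n'      >  adist_mini.txt
printf 'F1\t0.0\t0.1\t0.8\n' >> adist_mini.txt
printf 'F2\t0.1\t0.0\t0.7\n' >> adist_mini.txt
printf 'F3\t0.8\t0.7\t0.0\n' >> adist_mini.txt
# Metadata: one row per feature, one column per variable
printf '\tGroup\nF1\tA\nF2\tA\nF3\tB\n' > metadata_mini.txt
# Then, inside the interactive session:
#   omeClust -i adist_mini.txt --metadata metadata_mini.txt -o mini_out
```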
Batch job
Most jobs should be run as batch jobs.
Create a batch input file (e.g. omeClust.sh). For example:
#!/bin/bash
set -e
module load omeClust
omeClust \
    -i data/synthetic/dist_4_0.001_4_200.txt \
    --metadata data/synthetic/truth_4_0.001_4_200.txt \
    -o omeclust_demo \
    --plot
Submit this job using the Slurm sbatch command.
sbatch [--cpus-per-task=#] [--mem=#] omeClust.sh
Swarm of Jobs
A swarm of jobs is an easy way to submit a set of independent commands requiring identical resources.
Create a swarmfile (e.g. omeClust.swarm). For example:
omeClust -i adist1.txt -o output1 --metadata metadata1.txt --plot
omeClust -i adist2.txt -o output2 --metadata metadata2.txt --plot
omeClust -i adist3.txt -o output3 --metadata metadata3.txt --plot
omeClust -i adist4.txt -o output4 --metadata metadata4.txt --plot
Submit this job using the swarm command.
swarm -f omeClust.swarm [-g #] [-t #] --module omeClust
where
-g # | Number of gigabytes of memory required for each process (1 line in the swarm command file)
-t # | Number of threads/CPUs required for each process (1 line in the swarm command file)
--module omeClust | Loads the omeClust module for each subjob in the swarm
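When the inputs follow a numbered naming pattern like the one above, the swarmfile can be generated with a loop rather than typed by hand (the adistN.txt and metadataN.txt names here are just the example's placeholders; substitute your own):

```shell
# Generate one omeClust command per input pair (file names match the
# example swarmfile above; replace with your own inputs)
for i in 1 2 3 4; do
    echo "omeClust -i adist${i}.txt -o output${i} --metadata metadata${i}.txt --plot"
done > omeClust.swarm
wc -l omeClust.swarm    # one command line per subjob
```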