Old 09-17-2015, 03:05 PM   #1
gwilymh
Member
 
Location: Milwaukee

Join Date: Dec 2011
Posts: 72
Problem with SSPACE, multi-threading and memory restrictions

Hi,
I am attempting to use SSPACE version 3 to re-scaffold some mRNA contigs using paired-end RNA-Seq data. While SSPACE was designed for scaffolding genomes, I see no reason why it should not work in a similar fashion for re-scaffolding de novo assembled RNA contigs.

SSPACE is discussed in seqanswers here:
http://seqanswers.com/forums/showthread.php?t=8350

The SSPACE webpage can be reached from here:
http://www.baseclear.com/genomics/bi...setools/SSPACE

I consistently get the following error message:
Code:
=>date: Mapping reads to contigs with Bowtie
Thread 8 terminated abnormally: Can't open bwa output -- fatal
Out of memory!
Process 'extend/format contigs' failed on date
with:
resources_used.mem=47,424,992kb
resources_used.vmem=64,093,240kb
resources_used.walltime=00:34:33
It seems to me that SSPACE is running out of physical memory while trying to open the bwa output.

The "Thread 8 terminated abnormally" is especially cryptic, as the program was set to only run with 6 threads (-T 6).

My run parameters are:
Code:
#PBS -l nodes=1:ppn=6,mem=47gb,walltime=72:00:00
SSPACE_FILE=${HOME}/src/SSPACE-STANDARD-3.0_linux-x86_64/SSPACE_Standard_v3.0.pl
LIBRARY_FILE=/filepath/library_file_2.txt
CONTIG_FILE=/filepath/A_planci_pcg_transdec_MePath2Renam_echinoHomology.fasta   # -s, contigs being scaffolded
MIN_LINKS=10         # -k, minimum number of links to compute a scaffold
THREADS=6            # -T, number of threads
SKIP=0               # -S, skip processing of reads (0=no, 1=yes)
EXTEND_CONTIGS=1     # -x, extend contigs using sequence data (0=no, 1=yes, default 0)
VERBOSE=1            # -v, run the scaffolding process in verbose mode (0=no, 1=yes, default 0)
$SSPACE_FILE -l $LIBRARY_FILE -s $CONTIG_FILE -k $MIN_LINKS -T $THREADS -S $SKIP -x $EXTEND_CONTIGS -v $VERBOSE
The supporting library file is attached.
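For context, the library file follows the standard SSPACE layout, one line per library: library name, aligner, forward reads, reverse reads, insert size, insert-size error and orientation. The line below is a placeholder to show the format, not the contents of my attached file:
Code:
Lib1  bwa  reads_R1.fastq  reads_R2.fastq  300  0.25  FR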

I previously tried to run SSPACE over multiple nodes, but it seemed to use the cores on only one of the nodes.

Does anyone know of a way to get SSPACE to run across multiple nodes? Can it be run with MPI? If not, does anyone have any suggestions for reducing its memory requirements?
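One workaround I am considering, on the assumption that each mapping thread holds its own copy of the data, is simply to drop the thread count and request more memory per core. The figures below are guesses for my cluster rather than a general recommendation:
Code:
# fewer simultaneous mapping threads, more memory per core (placeholder values)
#PBS -l nodes=1:ppn=4,mem=60gb,walltime=72:00:00
THREADS=4    # passed to SSPACE via -T; lower peak memory at the cost of a longer run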
Attached Files: library_file_2.txt (339 Bytes)

Old 10-06-2015, 07:40 AM   #2
AnnabelleVH
Junior Member
 
Location: France

Join Date: Oct 2015
Posts: 1

Hi gwilymh,

Did you solve your problem?
I am encountering the same error, and I would love to benefit from your help if you found a solution.

cheers,
Old 11-24-2015, 01:02 PM   #3
sunnycqcn
Member
 
Location: Canada

Join Date: Apr 2013
Posts: 17
Did you solve your problem?

Quote:
Originally Posted by gwilymh View Post
[first post quoted in full above]
I have run into the same problem. Did you manage to solve it?
If you have, could you share your experience?
Thanks

Tags
multi-threading, parallel, rnaseq, scaffolding, sspace
