04-04-2012, 08:05 AM   #1
nposnien (Member; Location: Göttingen, Germany; Joined: May 2011; Posts: 13)

Memory issue with velvetg

I'm new to de novo assembly and hope to get some help with my memory issues. Here is what I'm trying to do:

I want to assemble RNAseq data using velvet/oases. I have one HiSeq2000 lane of 100 bp PE reads (approx. 170 M read pairs). This dataset has been randomly split into 10%, 20%, 30%, ..., 90% subsets. A colleague of mine managed to assemble the 100% dataset with approx. 104 GB RAM. I'm supposed to run the assemblies for the subsets. I managed to run the 10% and 20% samples through velveth, velvetg and oases without problems. The 30% sample is currently in the oases step. However, the 40% subset results in the following error messages:

------
Exited with exit code 1.

Resource usage summary:

CPU time : 27907.75 sec.
Max Memory : 55595 MB
Max Swap : 92723 MB

Max Processes : 4
Max Threads : 52

The output (if any) follows:

velvetg: Can't malloc 540 ShortReadMarkers totalling 10800 bytes: Cannot allocate memory
------

or:

------
Exited with exit code 1.

Resource usage summary:

CPU time : 31115.64 sec.
Max Memory : 40391 MB
Max Swap : 42936 MB

Max Processes : 4
Max Threads : 52

The output (if any) follows:

velvetg: Can't malloc 267 ShortReadMarkers totalling 5340 bytes: Cannot allocate memory
-------

Velveth was run successfully with: ./velveth /Dir 27 -fastq -shortPaired /Dir/X.fastq -short /Dir/Y.fastq
Velvetg settings are: ./velvetg /Dir -ins_length 300 -min_pair_count 2 -read_trkg yes -unused_reads yes

On our AMD Magny-Cours (MEGWARE) cluster I have 128 GB of memory, which should be enough (see above). The people responsible for the core facility think the problem might be that too many small chunks of data are produced, and that although there is enough memory in the system overall, it cannot handle that many small allocations (I hope this very unscientific description is clear enough). They suggested incorporating the Boost (C++) libraries?
Additional information: the data is quality filtered and trimmed; the short reads are the singletons left after trimming/filtering.
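
Since velvetg is failing on allocations of only a few kilobytes while far more memory is nominally available, it may be worth checking what per-process limits the job actually runs under; a low virtual-memory (address space) cap set by the shell or the scheduler can make malloc fail even when physical RAM is free. A rough sketch, assuming a bash shell inside the job environment (the PID below is hypothetical):

# Inside the job script, before velvetg:
ulimit -a                 # look at "virtual memory" and "max memory size"
ulimit -v                 # address-space limit in kB; "unlimited" is what you want here

# For a velvetg process that is already running (replace 12345 with the real PID):
cat /proc/12345/limits    # shows the soft/hard limits the process was started with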

I would appreciate any information on this issue.
Thanks a lot in advance!
01-02-2014, 04:21 PM   #2
RyNkA (Member; Location: Brisbane; Joined: Jul 2013; Posts: 20)

Hi all,

I encountered the same issue as described in this post:

[21686.720139] 19780000 / 35211509 nodes visited
[21693.361691] 19790000 / 35211509 nodes visited

velvetg: Can't malloc 254367 ShortReadMarkers totalling 2543670 bytes: Cannot allocate memory

-----


CPU time : 05:54:30
Wall time : 06:01:47
Mem usage : 93610144kb
CPU usage : 98%

I allocated 256 GB of RAM for this job, so there is plenty of memory to spare, but velvet just dies unexpectedly.

Any suggestions?
01-03-2014, 03:16 AM   #3
mastal (Senior Member; Location: UK; Joined: Mar 2009; Posts: 667)

How many reads do you have and how long are the reads?

There are a few things you could try that would help to use less memory.

What parameters has velvet been compiled with?

The parameters CATEGORIES and MAXKMERLENGTH affect the amount of memory velvet uses. If velvet has been compiled for more categories than you are using or a longer kmer length than you need, then reducing these values would reduce memory usage.
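
For reference, a rebuild sized down to a smaller MAXKMERLENGTH and the number of categories actually used might look like the sketch below; MAXKMERLENGTH and CATEGORIES are the standard velvet Makefile options, but the values shown are only examples and the source path is hypothetical, so check the manual for your version:

# Check how the current binaries were compiled (velveth run without arguments
# prints its version and, in recent 1.2.x releases, the compilation settings):
./velveth | head -20

# Rebuild with only as many categories and as long a k-mer as the data needs,
# e.g. for k = 27 and two read categories (one paired, one unpaired library):
cd velvet/        # hypothetical source directory
make clean
make 'CATEGORIES=2' 'MAXKMERLENGTH=31'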
01-03-2014, 10:34 PM   #4
RyNkA (Member; Location: Brisbane; Joined: Jul 2013; Posts: 20)

Quote:
Originally Posted by mastal
How many reads do you have and how long are the reads?

There are a few things you could try that would help to use less memory.

What parameters has velvet been compiled with?

The parameters CATEGORIES and MAXKMERLENGTH affect the amount of memory velvet uses. If velvet has been compiled for more categories than you are using or a longer kmer length than you need, then reducing these values would reduce memory usage.
Hi,

These are Ion Proton reads, with read lengths ranging from 30 to 300 bp (after quality trimming). I have around 25 million reads per library, 6 libraries in total.

MAXKMERLENGTH is set to 70 and CATEGORIES is 2.

I have already read some opinions online suggesting that the data is so big that, even if the allocated memory is still available, the process cannot continue.
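
One thing that might be worth ruling out (just a guess, not a confirmed fix): a malloc of ~2.5 MB failing while only ~93 GB of a 256 GB request is in use looks more like a per-process address-space or vmem cap than a machine that is actually out of RAM. Assuming a PBS/Torque-style cluster (the resource summary above looks like a PBS epilogue) and a bash job script, something like this shows which limits velvetg really runs under:

# Inside the job script, before launching velvetg:
ulimit -a                                   # check "virtual memory" / "max memory size"
qstat -f "$PBS_JOBID" | grep -i -E 'mem'    # requested vs. used memory for this job

# If a vmem or ulimit cap below 256 GB shows up, raising it in the job request
# (or asking the admins to lift the shell limit) is the first thing to try.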