| Thread | Thread Starter | Forum | Replies | Last Post |
|---|---|---|---|---|
| Bowtie, an ultrafast, memory-efficient, open source short read aligner | Ben Langmead | Bioinformatics | 514 | 03-13-2020 04:57 AM |
| Bowtie out of memory | Palgrave | Bioinformatics | 6 | 03-04-2014 07:33 PM |
| Bowtie: Ultrafast and memory-efficient alignment of short reads to the human genome | Ben Langmead | Literature Watch | 2 | 03-04-2013 03:06 AM |
| bowtie memory warning | crh | Bioinformatics | 7 | 08-20-2012 02:24 AM |
| Bowtie memory error | polsum | Bioinformatics | 2 | 12-02-2011 01:17 PM |
#1
ChrisAU
Junior Member
Location: Tübingen
Join Date: Jan 2012
Posts: 9
Hi,
perhaps somebody can solve this little mystery for me. Running bowtie --chunkmbs 512 worked without any errors/warnings. As the mapping statistic was not as I expected, among other things, I also increased the value for chunkmbs to 4096. However, on the same dataset, with the other parameters untouched, bowtie now reports "Exhausted best-first chunk memory for read..." millions of times. At this point, the sever was using only 10% of it's overall memory. Could somebody explain this to me please? Chris |
#2
Member
Location: New York
Join Date: Dec 2010
Posts: 40
Hi ChrisAU,
Did you solve your problem? I am now having the same issue and I could use a little bit of help. Thanks!
Tags |
bowtie, chunkmbs, memory |