SEQanswers

Old 08-24-2017, 04:36 AM   #1
nermze
Junior Member
 
Location: Oslo

Join Date: Aug 2017
Posts: 4
Building a Linux machine for MiSeq

Given a virtually unlimited budget, what type of machine would you recommend for analysis of MiSeq data and genome assembly? We are mostly working on tuberculosis.

We are looking at an HP workstation with 128-512 GB of memory and a 20-24 core CPU.

Is this overkill for bacterial DNA? Will dual Xeon CPUs work well under Linux? How much storage should we aim for?

Any input is appreciated, thanks in advance.
Old 08-24-2017, 05:24 AM   #2
GenoMax
Senior Member
 
Location: East Coast USA

Join Date: Feb 2008
Posts: 7,048

If you have the budget, go for the machine described with as much RAM as you can afford. MiSeq runs themselves are not that large (~50 GB per run, even if you count raw flowcell data). Always get CPUs that are one or two steps below the top one available; the significant cost savings can go toward other components like memory/disk. Dual CPUs will work great under Linux.
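Using that ~50 GB per run figure, a quick back-of-envelope calculation can size the storage. This is just a sketch; the run rate and the retention factor for intermediate files are guesses you should replace with your own numbers:

```python
# Back-of-envelope storage estimate for MiSeq data.
# ~50 GB/run comes from the post above; runs/week and the
# retention factor are assumptions to adjust for your lab.
GB_PER_RUN = 50
RUNS_PER_WEEK = 2
WEEKS_PER_YEAR = 52
RETENTION_FACTOR = 2  # headroom for intermediate/derived files

raw_per_year = GB_PER_RUN * RUNS_PER_WEEK * WEEKS_PER_YEAR
total_per_year = raw_per_year * RETENTION_FACTOR
print(f"~{raw_per_year / 1000:.1f} TB raw per year, "
      f"plan for ~{total_per_year / 1000:.1f} TB with working space")
```

At two runs a week that already suggests planning for ~10 TB a year, which is why redundant multi-terabyte storage is worth budgeting from the start.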
Old 08-24-2017, 06:58 AM   #3
nermze
Junior Member
 
Location: Oslo

Join Date: Aug 2017
Posts: 4

Thank you so much, much appreciated!

Could I just ask one more question? Is there a preferred ratio of CPU cores to amount of RAM?
Old 08-24-2017, 07:06 AM   #4
GenoMax
Senior Member
 
Location: East Coast USA

Join Date: Feb 2008
Posts: 7,048

Since you are going to be working with bacterial genomes, you may be able to get away with ~3-5 GB per core. For human-genome-sized data one generally needs ~30 GB of RAM. Since different programs work in different ways, though, it would be hard to come up with a fixed recommendation.

For de novo assemblies you will likely use SPAdes. Check its manual for RAM/hardware recommendations in the examples it lists.
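The rules of thumb above can be turned into a rough sizing helper. This is only a sketch; the 3-5 GB/core range and the ~30 GB single-job floor come from the post, while the function name and defaults are made up for illustration:

```python
# Rough RAM sizing from the rules of thumb above.
# Assumption: ~3-5 GB per core for bacterial work, with a floor of
# ~30 GB so a single human-genome-sized job still fits.
def min_ram_gb(cores, gb_per_core=4, floor_gb=30):
    """Suggested minimum RAM: the per-core budget, but never below
    the floor needed by one large memory-hungry job."""
    return max(cores * gb_per_core, floor_gb)

for cores in (8, 16, 24):
    print(cores, "cores ->", min_ram_gb(cores), "GB")
```

So a 24-core box at ~4 GB/core lands around 96 GB, comfortably inside the 128-512 GB range being considered.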
Old 08-24-2017, 08:09 AM   #5
nermze
Junior Member
 
Location: Oslo

Join Date: Aug 2017
Posts: 4

I see, thanks for the quick reply.

We are starting with bacterial genomes but will move on to more complex organisms in the future. We have a Core i9 with 64 GB of RAM at the moment, and it's working fine, but we want to future-proof these new machines as much as possible. We will also be performing a lot of runs at the start, as there will be several other research groups. Am I right to assume that, although not necessary, lots of cores/RAM will generally speed up the process?
Old 08-24-2017, 08:29 AM   #6
GenoMax
Senior Member
 
Location: East Coast USA

Join Date: Feb 2008
Posts: 7,048

Quote:
Originally Posted by nermze View Post
I see, thanks for the quick reply.

We are starting with bacterial genomes but will move on to more complex organisms in the future. We have a Core i9 with 64 GB of RAM at the moment, and it's working fine, but we want to future-proof these new machines as much as possible. We will also be performing a lot of runs at the start, as there will be several other research groups.
A standalone machine will always have some limitations on how much you can do with it. If you need to make the resources available to other research groups simultaneously, you may be better off putting together a small compute cluster. That could require resources (e.g. dedicated server-room space, sysadmin expertise) that you may have to acquire.
Quote:
Am I right to assume that, although not necessary, lots of cores/RAM will generally speed up the process?
While technically correct, that is a bit of an oversimplification. At some point you will hit limits elsewhere (e.g. I/O, software) that may leave some of those cores idle while they wait for data to show up.
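Amdahl's law captures why those idle cores appear: any serial portion of a pipeline (I/O, single-threaded steps) caps the overall speedup. A small sketch, where the 10% serial fraction is an assumed figure, not a measurement:

```python
# Amdahl's law: overall speedup with n cores when a fraction of
# the work (here an assumed 10%: I/O, single-threaded steps)
# cannot be parallelized.
def speedup(cores, serial_fraction=0.10):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for n in (4, 16, 64):
    print(f"{n:3d} cores -> {speedup(n):.1f}x")
```

With even 10% serial work, 64 cores yield under a 9x speedup, so past a point extra cores buy very little.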
Old 08-24-2017, 01:24 PM   #7
nermze
Junior Member
 
Location: Oslo

Join Date: Aug 2017
Posts: 4

Thank you all for the great input, I really appreciate it.
Old 08-28-2017, 05:12 PM   #8
Carcharodon
Member
 
Location: Honolulu, HI

Join Date: Jul 2015
Posts: 40

What do you think about AMD's Threadripper 1950X for these kinds of builds? $1,000 for the processor, with 16 cores/32 threads, and it can be overclocked stably at 4.0 GHz. Good value?

Not sure how it performs with Linux.
Old 08-29-2017, 04:16 AM   #9
GenoMax
Senior Member
 
Location: East Coast USA

Join Date: Feb 2008
Posts: 7,048

Threadripper is very new. On paper it does offer a lot for the money and may very well become the go-to CPU for bioinformatics in a few months.

It may be best to wait a while before building a critical system around it, considering some of the microcode issues Ryzen appears to have suffered earlier this year.