Old 10-13-2013, 09:11 AM   #4
gvoisin
Junior Member
 
Location: Cork (Ireland)

Join Date: May 2011
Posts: 3

Hi. Personally, with an Intel i7 and 8 GB of RAM, I'm able to run analyses on large microarray data.
But I think that with high-throughput data like NGS data, you need a server with many nodes and plenty of RAM; it's not practical to process it on your laptop. HT data needs heavyweight tools.
Hence, an Amazon server is one solution, or your institution may have a compute cluster composed of many nodes with lots of RAM.

My strategy: I run the greedy steps, like alignment, on a server, save the result as an R object, and load it on my personal computer to continue the analysis. In principle, the whole analysis could be done on the server, but the job scheduler imposes some constraints.
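
For example, here is a minimal sketch of that server-to-laptop handoff in base R (the alignment function and file name are placeholders for illustration, not my actual code):

    # On the server: run the heavy step and save the result to disk
    aln <- run_alignment(reads)        # placeholder for your greedy/alignment step
    saveRDS(aln, file = "alignment_result.rds")

    # On the laptop: copy the file over, load the object, and continue
    aln <- readRDS("alignment_result.rds")
    summary(aln)

saveRDS()/readRDS() store a single object; save()/load() also work if you want to keep several objects together in one .RData file.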

When you have big data, think big computer: you waste your time processing the data on your own machine. Moreover, machine time is not expensive compared to the cost of generating the HT data in the first place.