SEQanswers

Old 12-10-2010, 11:25 AM   #1
pmiguel
Senior Member
 
Location: Purdue University, West Lafayette, Indiana

Join Date: Aug 2008
Posts: 2,315
Moore's law vs Sequence Costs: Relative Rates

Chad Nusbaum showed an interesting slide during his talk at the recent "Sequencing at the Tipping Point" meeting in San Diego. It plotted Moore's law and sequencing cost against time. Personally, I found the slide particularly apt, because it put some hard data behind something most of us have noticed. In essence: "my data sets don't cost any more, but my computers are too slow to process them!"

I don't have access to Chad's actual slide, but I found basically the same figure here:

http://www.economist.com/node/16349358

(It is about halfway down the page and is entitled "Baseline information".)

Until 2005, the slope of the sequencing cost line (a semi-log plot of the falling cost of DNA sequencing) roughly matched that of Moore's law. Then the sequencing cost line slopes down more steeply, pulling rapidly away from the dawdling Moore's Law line.

Moore's Law, to oversimplify, says you will pay half as much money for a computer of equal "power" every X years, where X is somewhere between 1 and 2 years. If you accept my analysis and The Broad's graph, then up until 2005 you could spend roughly the same fraction of your budget on new computers every time you bought a new sequencer. But after 2005, something happened (presumably second-generation sequencers), and you needed to spend more on computation. My question is: "how much more?"

To extrapolate further from The Economist's cartoon graph on the page linked above, I would estimate the "sequencing cost halving time" pre-2005 at about 1.4 years. After 2005 it looks to have dropped to about 0.5 years. Meanwhile, Moore's law is plotted with a steady halving time of about 1.2 years in the cartoon in question.

Please correct me if I have the math wrong, but as long as these relative rates hold, the compute budget needed to keep pace with a static sequencing budget grows at the difference of the two rates: 1/0.5 − 1/1.2 ≈ 1.17 doublings per year. That is, one needs to double one's expenditure on computation roughly every 0.86 years.
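The arithmetic can be sketched in a few lines of Python. Note the 0.5- and 1.2-year halving times are rough readings off The Economist's chart, not measured values:

```python
def budget_doubling_time(seq_halving_years, moore_halving_years):
    """Years until the compute budget needed to keep pace with a flat
    sequencing budget doubles.

    Data per dollar doubles every seq_halving_years; compute per dollar
    doubles every moore_halving_years.  Required spending therefore
    grows as 2**(t/seq - t/moore), so its doubling time is the
    reciprocal of the difference of the two rates.
    """
    return 1.0 / (1.0 / seq_halving_years - 1.0 / moore_halving_years)

# Post-2005 estimates read off the chart: sequencing cost halves every
# 0.5 years, Moore's law every 1.2 years.
print(round(budget_doubling_time(0.5, 1.2), 2))  # 0.86
```

Note that simply subtracting the halving times (1.2 − 0.5 = 0.7 years) overstates the pace; the rates, not the periods, are what subtract.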

Comments?

--
Phillip
Old 12-14-2010, 08:36 AM   #2
husamia
Member
 
Location: cinci

Join Date: Apr 2010
Posts: 66

I am not an expert, only sharing my view. There are too many variables. Suppose a new alignment algorithm, or something else that increases performance, emerges; wouldn't you see the slope increasing even further? It seems like your estimate errs on the safe side. Another variable: a major change in data format that increases efficiency would also drop costs even more. There are many other variables that may change drastically in the near future and alter the state of this graph.
Old 12-14-2010, 12:16 PM   #3
pmiguel
Senior Member
 
Location: Purdue University, West Lafayette, Indiana

Join Date: Aug 2008
Posts: 2,315

Quote:
Originally Posted by husamia View Post
I am not an expert, only sharing my view. There are too many variables. Suppose a new alignment algorithm, or something else that increases performance, emerges; wouldn't you see the slope increasing even further? It seems like your estimate errs on the safe side. Another variable: a major change in data format that increases efficiency would also drop costs even more.
I have heard this argument but I think sub-Moore's Law improvement in computation is more likely than the converse. For example, part of seeing the benefit of Moore's Law in recent years involves having code that deploys resources efficiently over multiple cores. If you don't have that built into your code, you don't even see the Moore's Law improvements one expects.

At best, my guess is that clever improvements of the type you describe are necessary just to keep pace with Moore's Law.

Quote:
Originally Posted by husamia View Post
There are many other possible variables that may drastically change in near future that would change the state of this graph.

Yes, I agree with this.

--
Phillip
Old 12-15-2010, 11:57 AM   #4
Joann
Senior Member
 
Location: Woodbridge CT

Join Date: Oct 2008
Posts: 231
Yet another view

Another version--I call it the sequence tsunami slide--can be viewed on page 4 of Sharon Terry's recent talk at the 4th National Conference on Genomics and Public Health (see link below)

https://www.cmpinc.net/2010PHGConfer...ions/Terry.pdf

No doubt about it, the tsunami of sequence data is just going to keep piling up, no matter how fast computers can run.
Old 12-29-2010, 01:49 PM   #5
putnam128
Junior Member
 
Location: Cincinnati

Join Date: Dec 2010
Posts: 2

After a quick review of Moore's Law on Wikipedia, I found this interesting counterpart to it: The Great Moore's Law Compensator (TGMLC). Basically, even though computers get faster, successive versions of software continue to gain "bloat," which effectively negates the advances at the hardware level.

I think there could be a counterpart to Moore's Law in genomic sequencing, but I think it would be short-lived. Like Moore's Law, the trend in genomic sequencing is, I think, rapidly approaching a limit. Personally, I think that once the $1000 genome is reached, market forces will take over and hold the price of genomic sequencing relatively constant.
Old 12-29-2010, 07:24 PM   #6
krobison
Senior Member
 
Location: Boston area

Join Date: Nov 2007
Posts: 747

While the TGMLC certainly exists (a certain outfit in Redmond is an ace at it), its effect can easily be overstated. Consider the number crunching you can now do on a laptop in R (or BLAST or the like) which once required much more time on a very expensive supercomputer. My phone greatly outclasses my favorite server from nearly two decades ago (though AFAIK there isn't yet an Android port of R -- perhaps in 2011?).
Old 01-03-2011, 04:33 AM   #7
pmiguel
Senior Member
 
Location: Purdue University, West Lafayette, Indiana

Join Date: Aug 2008
Posts: 2,315

Quote:
Originally Posted by putnam128 View Post
I think there could be a counterpart to Moore's Law in genomic sequencing, but I think it would be short-lived. Like Moore's Law, the trend in genomic sequencing is, I think, rapidly approaching a limit. Personally, I think that once the $1000 genome is reached, market forces will take over and hold the price of genomic sequencing relatively constant.
I am not familiar with a market force that holds "price ... relatively constant". Perhaps you could expand?

I am not an economist, but I have seen a "drive to commodity" force that constantly presses prices downward, even when the quality of the cheaper products is lower. I am thinking specifically of computer floppy drives. Before they reached obsolescence, there was a period when they were so cheaply made they barely functioned: information written by one floppy drive could sometimes not be read by another.

--
Phillip
Old 01-03-2011, 04:51 AM   #8
Bruins
Member
 
Location: Groningen

Join Date: Feb 2010
Posts: 78

Quote:
Originally Posted by pmiguel View Post
I am not an economist, but I have seen a "drive to commodity" force that constantly presses prices downward, even when the quality of the cheaper products is lower. I am thinking specifically of computer floppy drives. Before they reached obsolescence, there was a period when they were so cheaply made they barely functioned: information written by one floppy drive could sometimes not be read by another.
I worry more about the quality of interpretation of the data than that of the data itself.
Old 01-03-2011, 06:21 AM   #9
putnam128
Junior Member
 
Location: Cincinnati

Join Date: Dec 2010
Posts: 2

What I was referring to is supply vs demand curves. In most markets there is an equilibrium point where the price of an item will settle, based on the cost of manufacturing vs the cost of consumption. Take the price of bread, for example: you can go to any grocery store in an area and pick up a loaf of bread for roughly the same price. What you see in computing markets, I believe, is a continually shifting average that has in effect kept computer system prices fairly unchanging. By this I mean that every time a new processor line comes out, the standard for what is considered an average PC jumps with it. And market forces have, in some respects, set the price of what an average PC should cost.

Thinking back on it now, my earlier statement was meant more as a macro-economic type view on genome sequencing, and probably shouldn't have been made in the context of this thread.