  • NGS platform: To buy or not to buy?

    This is a question I have been hearing lately. Malaysia does not own any NGS platform yet, but it's coming (I know of at least four machines) and they will probably arrive by the end of this year. When I ask which platform, people say they're not sure (or they're not sure they want to let me know).

    I was told that many of the NGS machines are owned by small labs rather than big genome centers. My lab used to have plans to buy a 454 machine too. After doing all the mathematics and economics, we decided not to.

    Yes, it's so much cheaper when you run sequencing on your own. And you can do lots of things, like barcoding your samples: up to 4,000+ samples with a 6 nt barcode (if you have software that can sort that many different samples).
    There are limitations if you use a service provided by others (I should know).
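The 4,000+ figure follows directly from the combinatorics of a 6 nt barcode: with 4 possible bases at each of 6 positions there are 4^6 = 4096 distinct sequences. A quick sketch (the function name is mine, just for illustration):

```python
# Count the distinct DNA barcodes of a given length: 4 bases per position.
def barcode_count(length: int) -> int:
    return 4 ** length

print(barcode_count(6))  # 4096 possible 6 nt barcodes
```

In practice the usable set is smaller, since barcodes are usually chosen to differ by more than one base so sequencing errors don't cause misassignment.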

    Just found an article in Genome Technology, published in April 2008:
    "Is Next-Gen Technology Right for Your Lab?"


    Some of the things you might want to consider before you purchase a machine:
    1) Data
    Storing the data will be more expensive than generating it. Slow data transfer rates are another problem.
    2) Cost
    Extra expenses such as data storage upgrades, technical support, and a suitable environment for the machine should be taken into consideration. They found that the Solexa performs better in a cold environment.
    3) Labour (hire someone with experience)
    4) Test runs (optimization could take weeks or months)
    5) How long before the machine is replaced by a new technology?
    Usually 5 years. Anything beyond that will be called LGS (last-generation sequencing). 3rd-generation sequencing will be here in 2010.

    Then again, please correct me if I'm wrong.

    Cheers

  • #2
    Here is a nice comparison table:
    (link: politigenomics.com)
    --
    bioinfosm



    • #3
      Originally posted by Melissa View Post
      1) Data
      The data storage will be more expensive than generating the data. Another problem is slow data transfer rate.
      [...]
      I've never understood where the numbers in point 1 come from. In total, the cost of computing to run something like a GAII is now about $10K per instrument (it used to be more like $40K). That's around 2-5 TB of disk and a reasonable number of cores. You can do it even more cheaply if you wish: you only need around 1.5 TB (for, say, 50 bp reads), as long as you discard raw data immediately, and you only need enough compute power to process the raw data and do alignments during the run time of the machine. Run times are currently around 9-10 days for long read pairs (that's a lot).

      Anyway, if you amortize the cost of the computing over the lifetime of the instrument (let's say 3 years), with, say, an average 7-day run time, you end up with about $80 per run. Even if you spent $50K on computing per instrument, it's only $300 per run. In fact, you'd have to spend $500K on computing, per instrument, for the amortized cost to be greater than the cost of the reagents and chips.
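The amortization above is easy to reproduce. A back-of-the-envelope sketch (the hardware figures, 3-year life, and 7-day average run are the post's numbers; the function names are mine):

```python
# Amortize compute hardware cost over every run in a sequencer's lifetime,
# using the post's figures: $10K-$500K hardware, 3-year life, 7-day runs.
def runs_over_lifetime(lifetime_years: float, run_days: float) -> float:
    """Maximum number of runs at full utilization."""
    return lifetime_years * 365 / run_days

def cost_per_run(hardware_usd: float, lifetime_years: float, run_days: float) -> float:
    """Hardware cost spread across all runs in the instrument's life."""
    return hardware_usd / runs_over_lifetime(lifetime_years, run_days)

for hw in (10_000, 50_000, 500_000):
    print(f"${hw:,} of hardware -> ${cost_per_run(hw, 3, 7):,.0f} per run")
```

At full utilization this gives roughly $64, $320, and $3,200 per run respectively; the post's ~$80 and ~$300 presumably allow for some downtime between runs, and $500K of hardware is indeed about where the amortized cost approaches a ~$4K reagent run.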

      So I don't understand where this comes from. $80 on computing for a $4K reagent run seems fine to me. More to the point, many people report a 5-15% run failure rate (all causes); amortized over the lifetime of the instrument, that costs you FAR more than the computing (about 6X more, in fact). So it makes more sense to worry about the causes of run failure.
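The failure-rate comparison works out the same way: the expected reagent loss per run is just the failure rate times the reagent cost. A minimal sketch, using the post's figures ($4K reagents, 5-15% failure, ~$80 amortized compute):

```python
# Expected reagent loss per run from failed runs, vs. amortized compute cost.
def expected_failure_cost(reagents_usd: float, failure_rate: float) -> float:
    return reagents_usd * failure_rate

COMPUTE_PER_RUN = 80.0  # the post's amortized compute figure

for rate in (0.05, 0.10, 0.15):
    loss = expected_failure_cost(4_000, rate)
    print(f"{rate:.0%} failure rate: ${loss:.0f}/run, "
          f"{loss / COMPUTE_PER_RUN:.1f}x the compute cost")
```

Across the 5-15% range the expected loss is $200-600 per run; around a 12% failure rate it is about $480, i.e. 6x the ~$80 compute figure the post cites.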

      Maybe it comes from the idea that you need to keep all the data, and at around $500 per terabyte that ends up being comparable to the run costs in terms of reagents and chips? Well, you don't need to keep raw data beyond your chosen QC period, and the data that is left boils down to a few tens of gigabytes, so again around $50 to store. Will that data even be kept for 3 years?
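The storage figure checks out the same way, assuming disk at the post's ~$500/TB and a run leaving around 100 GB after the raw data is discarded (the exact residual size is my assumption within the post's "few tens of gigabytes"):

```python
# Cost of keeping the post-QC data from one run on disk.
def storage_cost(gb: float, usd_per_tb: float = 500.0) -> float:
    return gb / 1000.0 * usd_per_tb

print(storage_cost(100))  # 50.0 -> about $50 per run at $500/TB
```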


      Data transfer - another myth. Data transfer is as fast as your network, so gigabit, I'd hope. If not, please upgrade it; this will cost you a few $K at most. Gigabit means around 70-120 Megabytes/s aggregate bandwidth. A system like a GAII produces anywhere up to 4 Megabytes per second of raw data (+/- 4 MB/s - lots of factors). Anyway, a typical network has plenty of bandwidth to run 1 system, and in principle quite a number.

      I think this myth arises because many people wait till the end of a run before moving data to where they want to process and look at it, so they add copying time to the end of the run. The solution is obvious and adopted in many places: write the data where you want it over the network in the first place, or, even better and more reliably, mirror it in *real time* to where you want it. Thus the entire data set is off the instrument and on your disk within a few minutes of a run finishing. If you do this, I do not see how data transfer can be a problem, as you can begin the next run immediately. Some labs even process the data as they collect it (but this is tricky to implement yourself - I believe the iPar attempts something similar too).
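As a sanity check on the bandwidth claim, the headroom is easy to quantify (the ~110 MB/s usable gigabit figure and ~4 MB/s per-instrument rate are the post's numbers; the function name is mine):

```python
# How many instruments' worth of raw-data traffic fits on one link,
# if data is mirrored off the instrument continuously during the run.
def instruments_supported(link_mb_s: float, per_instrument_mb_s: float) -> int:
    return int(link_mb_s // per_instrument_mb_s)

print(instruments_supported(110, 4))  # 27
```

So even a single gigabit link could in principle carry the real-time output of dozens of GAII-class instruments, which is why streaming the data off during the run makes the "slow transfer" problem disappear.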

      Sanger does something very like this on ~40 GAs and has produced over 4 terabases of data so far (6-8 raw). (I think that compute comes out at about $40K per instrument, although it is also used for other heavy compute jobs.)

      The other points seem very sensible and good advice.
      Last edited by cgb; 12-31-2008, 10:30 AM.



      • #4
        Ipar

        Originally posted by cgb View Post
        If you do this I do not see how data transfer can be a problem as you can begin the next run immediately. Some labs even process the data as they collect it (but this is tricky to implement yourself - i believe the iPar attempts something similar too).
        Ipar performs image analysis in "real time": when the run finishes, the full image analysis is already done. That's what's called Firecrest in the Illumina/GA pipeline.

        That's a huge improvement in both disk storage and CPU cycles. If you want (and you should), you can toss the images and start your analysis from Bustard/base calling.

        Ipar will perform base calling very soon too.
        -drd
