SEQanswers

SEQanswers (http://seqanswers.com/forums/index.php)
-   The Pipeline (http://seqanswers.com/forums/forumdisplay.php?f=13)
-   -   PacBio reveals (few) commercial specs (http://seqanswers.com/forums/showthread.php?t=3172)

Fred 11-18-2009 11:18 PM

PacBio reveals (few) commercial specs
 
Hi,

PacBio reveals (few) commercial specs

http://www.genomeweb.com/sequencing/pacbio-reveals-commercial-specs-initial-focus-long-reads-short-runs-low-experime


Fred

NextGenSeq 11-19-2009 08:57 AM

The specs don't seem very impressive to me. They are planning to match, in the second half of 2010, what current next-gen platforms already generate.

hessec 11-19-2009 12:28 PM

For those of us without premium subscriptions to genomeweb, would anyone care to provide a synopsis or at least some numbers?

Never mind... you just have to register. I'm an idiot.

krobison 11-20-2009 07:20 AM

You can find some commentary on this at my blog, Omics Omics.

I think the dismissive poster is missing that PacBio will, at least initially, be playing for different customers in different spaces, though also bumping up against some of the established players (particularly 454). Very long read lengths & $100 as the atomic experiment size are quite different from the existing platforms. Perhaps there will be very few labs which would want a PacBio to the exclusion of the others, but it could well play a role as a complementary machine for a lot of projects.

pmiguel 12-07-2009 05:30 AM

Quote:

Originally Posted by krobison (Post 10694)
You can find some commentary on this at my blog, Omics Omics.

I think the dismissive poster is missing that PacBio will, at least initially, be playing for different customers in different spaces, though also bumping up against some of the established players (particularly 454). Very long read lengths & $100 as the atomic experiment size are quite different from the existing platforms. Perhaps there will be very few labs which would want a PacBio to the exclusion of the others, but it could well play a role as a complementary machine for a lot of projects.

Yes, but still... There was the possibility that the commercial launch of PacBio's machine would immediately obsolete all current platforms. It does not look like that will be the case. Just looks like we will be living through hyper-Moore's law decreases in price/base for at least several more years to come.

I do agree that $100 in reagents for an experiment does sound like a big deal. That suggests that total cost would be less than $500 for 50 megabases of data. A similar experiment would cost close to 5-10x that price on a 454 now. But, it may well be that Roche can close that price gap in the upcoming year.
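The price comparison above can be sketched as a quick back-of-the-envelope calculation (all figures are the illustrative numbers from this post, not official pricing from either vendor):

```python
# Back-of-the-envelope cost-per-megabase comparison, using the
# illustrative figures from the post (not official pricing).
pacbio_total_cost = 500.0  # $ total for the experiment, per the post
yield_mb = 50.0            # megabases of sequence per experiment

pacbio_per_mb = pacbio_total_cost / yield_mb            # $/Mb
roche_454_per_mb = (5 * pacbio_total_cost) / yield_mb   # low end of the "5-10x" estimate

print(f"PacBio:    ${pacbio_per_mb:.2f}/Mb")
print(f"454 (~5x): ${roche_454_per_mb:.2f}/Mb")
```

At these numbers, PacBio would come in around $10/Mb versus roughly $50-$100/Mb on a 454, which is the gap the post suggests Roche might close.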

Informatics, as predicted by anyone who gave the issue much thought, looks more and more like a bottleneck. Longer reads do help with some aspects of informatics--mainly de novo assembly. You just spent your $100-$500 for 50 megabases of sequence. Now how much to convert that sequence into something that enables the user to plan the next experiment?

Depends on the application, of course. Point being that while informatics is a bottleneck, much of its narrowness derives from the end user's ability (or lack thereof) to understand and utilize the data generated. This issue is compounded by the typical informatician's difficulty in understanding the biology the end user is studying. That is, both ends of this informatics->biology/biology->informatics pipe tend to lack understanding of each other's limitations.

It is a big problem. But it is actually the meta-problem that is most dispiriting. That is: not only do both ends lack understanding of each other's limitations, but they are also unaware of this lack of understanding and how detrimental it is to coming up with any solution.

This cannot be the first time divergent disciplines have been forced to merge to deal with a problem. And the meta-discipline, how best to foster such a merger, would be what? Some sort of process management? Anyone know?

--
Phillip

RockChalkJayhawk 12-25-2009 08:22 AM

PacBio Deployments
 
I know PacBio is being hush hush about it, but does anyone know who the early access people are yet?

avilella 07-20-2010 04:51 AM

More PacBio specs
 
These come straight from the source, Dr. Steven T. Lott gave a talk at the Wellcome Trust Genome Campus (2010-Jul-20).

They have 80K "wells" (not sure the term is right) in each cell, and cells come in long arrays of 8 cells. Up to 12 arrays can be loaded at the same time in a machine, so 96 cells x 80K from the start, but right now only 1/3 of the wells get loaded with a polymerase, and the remaining wells can be filtered out for bioinformatic analysis.
Sample preparation is 4 hours, so one can prepare the sample in the morning and have it running in the afternoon.
There is no amplification step.
The polymerase can go round and round through the DNA element, allowing multiple coverage of the same molecule in the same well.
The machine produces a movie at a rate of 4TB every 30 min, but if one only stores the pulse/trace files (optional), it goes down to 20GB per 30 min, and if only the final calls w/qualities are stored, then 2GB every 30 min.
All secondary analysis software is open-source and will be released soon.
Strobe sequencing: you can design it as you please, he showed real data with 2 fragments and 3 fragments.
The current system sells at ~1000bp read length, so you can do strobe of 4x250bp reaching 6kb of the insert, or 2x500bp reaching 6kb, or whatever the heck you prefer to do.
Why are the polymerases dying? They are being zapped (he said "electrocuted") by the fluorophores. They die and don't do anything from there, no noise from a dead well.
They are now improving the system to do:
Detection of methyl and hydroxymethyl modifications on the DNA, which is done by looking at the pulses at the software level.
Protein synthesis: even allowing reactions to take place and comparing the results.
RNA sequencing with no cDNA required, also soon.
They are internally going from 80K to 320K in each cell, from 30% to >80% in filling the wells with polymerases, and read length from 1,000 bp to 10,000-20,000 bp in some wells, with a polymerase upper limit of 70 kb they haven't reached yet. The longest insert sizes being used now are 20 kb, but they aim at 50 kb.
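The throughput and storage numbers in the talk summary above can be tied together in a small sketch (all figures are as reported in this post, not official specs; the run length is an arbitrary example):

```python
# Sketch of per-run ZMW yield and data volume, using the numbers
# quoted in the talk summary above (not official PacBio specs).
zmw_per_cell = 80_000     # "wells" (ZMWs) per SMRT cell
cells_loaded = 96         # 12 arrays x 8 cells
loading_fraction = 1 / 3  # fraction of wells with an active polymerase today

active_zmws = int(zmw_per_cell * cells_loaded * loading_fraction)

# Storage rates were quoted per 30 minutes, at three retention levels:
gb_per_30min = {"raw movie": 4000, "pulse/trace": 20, "calls+quals": 2}
run_hours = 2  # hypothetical run length for illustration
storage_gb = {k: v * run_hours * 2 for k, v in gb_per_30min.items()}

print(f"active ZMWs: {active_zmws}")
for level, gb in storage_gb.items():
    print(f"{level}: {gb} GB for a {run_hours} h run")
```

The point of the three retention levels is clear from the arithmetic: keeping the raw movie is roughly three orders of magnitude more storage than keeping only the final calls with qualities.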

Please use the comments to correct any info I've not got right.

GW_OK 07-20-2010 09:03 AM

No mention of error rates?

martinjf 07-28-2010 05:08 AM

It mostly fits what I heard: 80k ZMW sets, but as far as I understood it, 2 sets/SMRT cell at commercial launch (whitepaper "SMRT DNA Sequencing", January 2010, from the PacBio web site).

"The polymerase can go round and round through the DNA element, allowing multiple coverage of the same molecule in the same well."
this is covered in detail in the NAR 2010 paper, 1-8, doi:10.1093/nar/gkq543. It does sound very interesting.

Most other aspects you mention are supported by the Science paper, including error rates at the time it was written. It was around 17% in single-pass reads, but no particular bias was observed as far as I remember (meaning correction through sequencing depth is possible).
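The "no particular bias" point is why depth helps: if single-pass errors fall at random positions, they wash out in a consensus. A toy per-base majority-vote model makes this concrete (real PacBio errors are mostly indels and real consensus calling is more involved, so this substitution-only model is only an illustration):

```python
import math

def majority_vote_accuracy(p_err: float, n_passes: int) -> float:
    """Probability that a per-base majority vote over n_passes
    independent passes is correct, given per-pass error rate p_err.
    (Real circular-consensus calling handles indels; this simple
    binomial model only illustrates why unbiased errors average out.)"""
    # Correct whenever fewer than half the passes are wrong (n_passes odd).
    k_max = (n_passes - 1) // 2
    return sum(math.comb(n_passes, k) * p_err**k * (1 - p_err)**(n_passes - k)
               for k in range(k_max + 1))

for n in (1, 5, 15):
    print(f"{n} passes: {majority_vote_accuracy(0.17, n):.4f}")
```

Even with a 17% single-pass error rate, accuracy climbs steeply with the number of passes over the same molecule, which is exactly what the circular "round and round" sequencing described earlier enables.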

Hopefully it will be out soon (I heard early 2011 in Europe, but it is apparently still not official).

Lee Sam 08-09-2010 09:39 AM

I took a look at their PacBio devnet data - mostly 80ish mers, and about 60k reads. I keep hearing about a pretty high error rate, but I don't have a quick way to quantify that.

