SEQanswers

SEQanswers > Sequencing Technologies/Companies > Ion Torrent


Old 05-23-2011, 12:23 PM   #1
krobison
Senior Member
 
Location: Boston area

Join Date: Nov 2007
Posts: 747
Default Fragment length limits?

I've been told that with the Ion Fragment Kit, the maximum input fragment size should be 150 bp so that with adapters the fragment is no longer than 210 bp.

Can anyone with access to Ion's deepest, darkest secret documents (known elsewhere as "package inserts") confirm this? The application note on Amplicon Sequencing mentions nothing on the subject.

Anyone know if the constraint is imposed in the library prep or the emPCR template prep? What imposes it? Extension times? Nucleotide concentrations? emPCR micelle sizes?

One might guess this is something Ion will need to address in order to hit 454-like read lengths.
krobison is offline   Reply With Quote
Old 05-23-2011, 12:48 PM   #2
Marzanna
Junior Member
 
Location: Europe

Join Date: Mar 2008
Posts: 7
Default

We are still waiting for installation so I can only share theoretical knowledge from the secret documents ;-)

The fragmentation is done with a BioRuptor or a Covaris (both are described).
"If the size distribution of the fragmented DNA is in the 150–200 bp range, you can use alternative fragmentation methods such as the Covaris System."

After end repair and adapter ligation there is a size-selection step (e.g., Pippin Prep).
"For optimal sequencing results, size-select a DNA library with a mean size of 185–210 bp, with a size distribution of ±20 bp around the mean. Libraries with a mean size >~220 bp yield results of reduced sequencing quality."

I was also told that too wide an insert-size range (like bead-based size selection for SOLiD) leads to too high a percentage of short reads (50–100 bp).
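The quoted size-selection guidance can be sketched as a quick sanity check. This is only an illustration of the numbers quoted above (mean 185–210 bp, ±20 bp spread, >~220 bp flagged); the function name and return format are invented, not anything from Ion's documentation.

```python
def library_size_ok(mean_bp, spread_bp):
    """Check a size-selected library against the quoted guidance:
    mean size 185-210 bp, distribution within +/-20 bp of the mean,
    and a mean much above ~220 bp flagged as reduced quality."""
    if mean_bp > 220:
        return False, "mean > ~220 bp: expect reduced sequencing quality"
    if not (185 <= mean_bp <= 210):
        return False, "mean outside the recommended 185-210 bp window"
    if spread_bp > 20:
        return False, "size distribution wider than +/-20 bp"
    return True, "within guidance"

print(library_size_ok(195, 15))  # -> (True, 'within guidance')
print(library_size_ok(230, 15))  # -> (False, 'mean > ~220 bp: expect reduced sequencing quality')
```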

And now I will be taken directly to the IonHell
see you around
Marzanna
Marzanna is offline   Reply With Quote
Old 05-23-2011, 02:36 PM   #3
The_Dark_Base
Junior Member
 
Location: Cambridge, MA

Join Date: Aug 2010
Posts: 6
Default

The small insert size really undermines the use of the PGM for their often-touted library-validation application. Looking at the tech note, you have a library with adapters, then add an oligo to attach it to their particle. With all of that, only adapter dimers would work on the PGM?

There must be a size limitation due to how much DNA can be loaded onto their Ion Spheres, i.e. longer templates mean fewer copies per sphere, which produces less H+.

Just look at binding on other Life Technologies spheres:

"The amount of biotinylated DNA immobilized will depend on fragment size. Due to steric hindrance, the binding efficiency is significantly reduced when the fragment size exceeds..."

http://tools.invitrogen.com/content/.../high/3612.jpg
The_Dark_Base is offline   Reply With Quote
Old 05-23-2011, 06:53 PM   #4
SeqAA
Guest
 

Posts: n/a
Default

Quote:
Originally Posted by The_Dark_Base View Post
There must be a size limitation due to how much DNA can be loaded onto their Ion Spheres, i.e. longer templates mean fewer copies per sphere, which produces less H+.
this would be my guess as well. High copies per sphere until their algorithms get better?
  Reply With Quote
Old 05-23-2011, 07:01 PM   #5
ECO
--Site Admin--
 
Location: SF Bay Area, CA, USA

Join Date: Oct 2007
Posts: 1,290
Default

Quote:
Originally Posted by SeqAA View Post
this would be my guess as well. High copies per sphere until their algorithms get better?
Also sensor/chemistry improvement perhaps? Algo can't detect protons that aren't there from a low level of template.
ECO is offline   Reply With Quote
Old 05-23-2011, 10:51 PM   #6
zhengz
Member
 
Location: sweden

Join Date: Aug 2010
Posts: 21
Default

Quote:
Originally Posted by SeqAA View Post
this would be my guess as well. High copies per sphere until their algorithms get better?
As Ion prioritizes quick turnaround time, amplicon length has been sacrificed. The advertising says read length is going to double in 201x. It has been quite a long time since 454 advertised 1000 bp, but they are still not there yet. Read length is the hard part for emPCR/bridge-PCR-based technologies.
zhengz is offline   Reply With Quote
Old 05-31-2011, 07:07 AM   #7
Claudia Stewart
Junior Member
 
Location: Frederick MD

Join Date: Sep 2008
Posts: 4
Default

Quote:
Originally Posted by krobison View Post
I've been told that with the Ion Fragment Kit, the maximum input fragment size should be 150 bp so that with adapters the fragment is no longer than 210 bp.

Can anyone with access to Ion's deepest, darkest secret documents (known elsewhere as "package inserts") confirm this? The application note on Amplicon Sequencing mentions nothing on the subject.

Anyone know if the constraint is imposed in the library prep or the emPCR template prep? What imposes it? Extension times? Nucleotide concentrations? emPCR micelle sizes?

One might guess this is something Ion will need to address in order to hit 454-like read lengths.
Apparently, the issue is the micelle size – material larger than 250 bp doesn’t drive back to the bead well. I suspect that this will no longer be the case with the new OneTouch, since there is no oil in the system. At this point, that seems to be the only way to get to longer fragment lengths.
Claudia Stewart is offline   Reply With Quote
Old 08-18-2011, 09:14 PM   #8
BBoy
Member
 
Location: Pacific Northwest

Join Date: Oct 2010
Posts: 47
Default

Quote:
Originally Posted by Claudia Stewart View Post
Apparently, the issue is the micelle size – material larger than 250 bp doesn’t drive back to the bead well. I suspect that this will no longer be the case with the new OneTouch, since there is no oil in the system. At this point, that seems to be the only way to get to longer fragment lengths.
Does de-phasing impose any limits on their read lengths?
BBoy is offline   Reply With Quote
Old 08-28-2011, 06:08 AM   #9
krobison
Senior Member
 
Location: Boston area

Join Date: Nov 2007
Posts: 747
Default

Quote:
Originally Posted by BBoy View Post
Does de-phasing impose any limits on their read lengths?
Yes. De-phasing is the ultimate bugbear for any clonal sequencing approach.

Note that since my original query Ion has released a much longer dataset, though that protocol won't be public until sometime this fall.
krobison is offline   Reply With Quote
Old 08-28-2011, 10:28 AM   #10
BBoy
Member
 
Location: Pacific Northwest

Join Date: Oct 2010
Posts: 47
Default

But that does not seem to be the limiter at this time, since they are driving read length up without changes to the chemistry or chip. Or am I missing something? It seems that longer-term read length could be helped by a more sensitive readout, which would enable smaller beads with fewer strands on them. Assuming the dephasing noise scales as sqrt(N), where N is the number of strands on a bead, things should get better linearly with bead size, no?
BBoy is offline   Reply With Quote
Old 08-31-2011, 05:51 AM   #11
krobison
Senior Member
 
Location: Boston area

Join Date: Nov 2007
Posts: 747
Default

I'd have to really think through the dephasing, and probably model it.

The probability that any given molecule will dephase in a given cycle is 1 minus the repetitive yield (I'll call this Pdephase). If you are going to model this, one option is to assume that dephased reads are lost forever. Alternatively, one can assume they can be rephased under the right circumstances (indeed, Ion claims their flow order encourages recovering dephased molecules).

I'm not convinced that fewer molecules leads to less dephasing until you get to very small numbers. The current beads carry 800K molecules per bead; even with very high repetitive yield you are going to lose some in each round, and presumably that is close to a fixed proportion. Clearly, if molecule numbers approach or go below 1/Pdephase, then you need to model it as a stochastic process. I'm not sure what Pdephase is, but presumably you could crudely estimate it from the distribution of read lengths.

If you get to really small numbers (but not 1), then a possible issue is that the relative contribution of a single dephased molecule to the population becomes large -- so it is rare that a bead has any dephased molecules, but a small number of dephased molecules kills your signal-to-noise ratio.
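This back-of-envelope reasoning can be sketched numerically. The following is a toy model under the "lost forever" assumption described above; the Pdephase and cycle-count values are made-up illustrative numbers, not Ion specifications.

```python
import random

def expected_in_phase(p_dephase, n_cycles):
    """Deterministic limit (large beads): with a fixed per-cycle dephasing
    probability and no rephasing, the in-phase fraction after n cycles
    is simply (1 - p)^n."""
    return (1.0 - p_dephase) ** n_cycles

def simulate_bead(n_molecules, p_dephase, n_cycles, rng):
    """Stochastic version for very small beads: thin the in-phase
    population one molecule at a time each cycle."""
    in_phase = n_molecules
    for _ in range(n_cycles):
        in_phase = sum(1 for _ in range(in_phase) if rng.random() > p_dephase)
    return in_phase / n_molecules

if __name__ == "__main__":
    # With 800K molecules per bead the deterministic limit applies;
    # e.g. an illustrative 0.5% per-cycle loss leaves ~61% of molecules
    # in phase after 100 cycles.
    print(f"large bead: {expected_in_phase(0.005, 100):.2f}")
    # For very small beads the outcome is noisy from run to run,
    # which is the stochastic regime discussed above.
    rng = random.Random(1)
    print("small beads:", [round(simulate_bead(50, 0.005, 100, rng), 2)
                           for _ in range(5)])
```

The deterministic curve also supports the suggestion that Pdephase could be crudely estimated from the read-length distribution: the in-phase fraction decays geometrically, so the decay rate fixes p.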
krobison is offline   Reply With Quote
Old 08-31-2011, 01:43 PM   #12
lek2k
Member
 
Location: Australia

Join Date: Aug 2011
Posts: 32
Default

I'm going to go off memory on this one so please correct me if I'm wrong.

1. Dephased reads may be lost forever, but the contributions they make persist. This is evident in an ionogram: the zero-mer calls look a little noisy and stay that way deep into the read, even after rephasing.

2. According to the guys in the Ion Community, the flow order can lead to dramatic improvements. However, this was only a simulation. I have found it very hard to find where in the code they take advantage of the 32-flow redundant cycle. Anyway, I suspect the long-read dataset reflects a combination of chemistry, flow order, and other flow parameters being tweaked. There was also a change in the version of ion-Analysis that hasn't been released. If you look at the summary-report PDF, the signal-incorporation plot looks very, very odd.

3. Fewer molecules means a weaker signal, which means good luck trying to differentiate between a zero-mer and a one-mer once things start dephasing. I also agree that with fewer molecules, even a few dephasing would be a large proportion of the total, making it difficult to rephase and base-call. Then again, there could be a biochemical law relating strand count, polymerase, and reagents?

4. For each chip sequencing run, the fitted carry-forward, incomplete-extension, and droop probabilities are listed in one of the files for each CAFIE region (13×13 wells).
lek2k is offline   Reply With Quote
Old 09-02-2011, 12:45 PM   #13
krobison
Senior Member
 
Location: Boston area

Join Date: Nov 2007
Posts: 747
Default

Sorry, by lost I meant they no longer contribute signal but do contribute noise.

The flow-order improvement wouldn't show up in the code. The idea is that if you follow base X with another X flow a few bases later, you can catch some reads back up. For example, imagine the flow order

1 2 3
A T A

Incompletely extended molecules after flow 1 will be finished off in flow 3 and resynchronized, unless of course the next correct base is T, in which case the majority of molecules on the bead will have marched ahead.
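This catch-up logic can be illustrated with a toy flow simulator. The templates, positions, and function names below are invented for illustration; this is not Ion's base caller, just the mechanics of the A-T-A example above.

```python
def extend(pos, template, base):
    """Advance one strand through any homopolymer run of `base` at pos."""
    while pos < len(template) and template[pos] == base:
        pos += 1
    return pos

def run_flows(template, flows, start_positions):
    """Track strand positions through a flow order, starting from a state
    where some strands lag behind (incomplete extension)."""
    positions = list(start_positions)
    for base in flows:
        positions = [extend(p, template, base) for p in positions]
    return positions

# After flow 1 (an A flow), suppose the main population extended through
# both A's (position 2) while a laggard incorporated only one (position 1).
# If the next correct base is G, the repeated A flow resynchronizes them:
print(run_flows("AAG", "TA", [2, 1]))  # -> [2, 2]  (back in phase)
# But if the next correct base is T, the main population marches ahead:
print(run_flows("AAT", "TA", [2, 1]))  # -> [3, 2]  (still out of phase)
```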
krobison is offline   Reply With Quote
Old 12-06-2011, 04:09 PM   #14
simon.jarman@aad.gov.au
Junior Member
 
Location: Hobart, Australia

Join Date: May 2009
Posts: 5
Default

Has anyone tried the 200 bp kit yet? Using the 100 bp kit on variable-size amplicons, we see a strong bias against the longer (~200 bp) amplicons, so that amplicons around 150 bp are proportionally over-represented. This will cause us many problems in analysing environmentally derived amplicons. Does anyone know how Life Tech plans to get around this for longer read lengths?
simon.jarman@aad.gov.au is offline   Reply With Quote
Old 12-07-2011, 12:52 PM   #15
arolfe
Member
 
Location: 02119

Join Date: Jul 2011
Posts: 29
Default

We've used the 200 bp kit. It seems to work as advertised and increases the maximum construct size (adapters + insert) from about 230 bp to about 330 bp. I don't have any data on whether the drop-off in sequencing efficiency is as steep at the top of the size range as it was for the 100 bp protocol.

The only problem is that there's no OneTouch 200bp kit yet, so you have to do the ePCR by hand for the moment.
arolfe is offline   Reply With Quote
Old 12-08-2011, 09:20 AM   #16
ningwang
Junior Member
 
Location: Boston

Join Date: Sep 2011
Posts: 6
Default

Quote:
Originally Posted by krobison View Post
I've been told that with the Ion Fragment Kit, the maximum input fragment size should be 150 bp so that with adapters the fragment is no longer than 210 bp.
I thought the average read length depended on whether you're using the 314 or the 316 chip.
ningwang is offline   Reply With Quote
Old 12-08-2011, 09:26 AM   #17
ningwang
Junior Member
 
Location: Boston

Join Date: Sep 2011
Posts: 6
Default

Quote:
Originally Posted by BBoy View Post
Does de-phasing impose any limits on their read lengths?
This is my guess as well. If you look at the supplementary data submitted by Rothberg in his Nature article (from July?), the raw data get pretty ugly after 60–80 bp and require some heavy phase correction to clean up.
ningwang is offline   Reply With Quote
Old 12-08-2011, 09:54 AM   #18
ningwang
Junior Member
 
Location: Boston

Join Date: Sep 2011
Posts: 6
Default

Quote:
Originally Posted by krobison View Post
I'd have to really think through the dephasing, and probably model it.

The probability that any given molecule will dephase in a given cycle is 1 minus the repetitive yield (I'll call this Pdephase). If you are going to model this, one option is to assume that dephased reads are lost forever. Alternatively, one can assume they can be rephased under the right circumstances (indeed, Ion claims their flow order encourages recovering dephased molecules).

I'm not convinced that fewer molecules leads to less dephasing until you get to very small numbers. The current beads carry 800K molecules per bead; even with very high repetitive yield you are going to lose some in each round, and presumably that is close to a fixed proportion. Clearly, if molecule numbers approach or go below 1/Pdephase, then you need to model it as a stochastic process. I'm not sure what Pdephase is, but presumably you could crudely estimate it from the distribution of read lengths.

If you get to really small numbers (but not 1), then a possible issue is that the relative contribution of a single dephased molecule to the population becomes large -- so it is rare that a bead has any dephased molecules, but a small number of dephased molecules kills your signal-to-noise ratio.
First of all, I don't know if Pdephase depends on how many molecules you have per bead. We know that the PGM uses 2 µm diameter beads, which gives you a surface area of about 12.5 µm². If they're loading 800K molecules per bead, you've got roughly 15–16 nm² per molecule. Is that dense enough for one polymerase to impede another? If so, I suppose a lower density might reduce dephasing. But if 15–16 nm² per molecule is ample room, having fewer molecules per bead won't change your dephased ratio.

Plus, if you go with fewer molecules, at what point are you pushing the limits of the ion sensors at the bottom of the well? Also, in the scenario you described (a small number of molecules), you might mask dephasing problems but you'll increase your indel error rate.
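The surface-density arithmetic above checks out and can be reproduced (the bead diameter and molecules-per-bead figures are the ones quoted in this thread, not independently verified specs):

```python
import math

def area_per_molecule_nm2(bead_diameter_um, molecules_per_bead):
    """Surface area of a spherical bead divided among its attached strands."""
    radius_um = bead_diameter_um / 2.0
    area_um2 = 4.0 * math.pi * radius_um ** 2  # sphere surface area
    return area_um2 * 1e6 / molecules_per_bead  # 1 um^2 = 1e6 nm^2

# A 2 um bead has ~12.6 um^2 of surface; 800K strands gives ~15.7 nm^2
# per strand, i.e. roughly a 4 nm x 4 nm patch each.
print(f"{area_per_molecule_nm2(2.0, 800_000):.1f} nm^2")  # -> 15.7 nm^2
```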
ningwang is offline   Reply With Quote
Old 12-08-2011, 10:49 AM   #19
krobison
Senior Member
 
Location: Boston area

Join Date: Nov 2007
Posts: 747
Default

Quote:
Originally Posted by ningwang View Post
I thought the average read length depended on whether you're using the 314 or the 316 chip.
The premise of this thread is a bit outdated, as the protocols have undergone several major changes since I started it.

I believe read length is independent of the chip (to a first approximation); it is the newer template-preparation protocols and better base-calling software that are enabling longer reads.
krobison is offline   Reply With Quote