SEQanswers

Old 03-05-2016, 06:20 AM   #21
DNA_Dan
Member
 
Location: Montana

Join Date: Nov 2008
Posts: 21
Default

Old thread but same issues with qPCR. I am surprised that no one is jumping on the ddPCR bandwagon. I guess it just has its own issues and isn't that popular. Anyone with experience care to comment? Desperately looking to ditch qPCR.
DNA_Dan is offline   Reply With Quote
Old 03-05-2016, 07:21 AM   #22
cement_head
Senior Member
 
Location: Oxford, Ohio

Join Date: Mar 2012
Posts: 253
Default

Quote:
Originally Posted by DNA_Dan View Post
Old thread but same issues with qPCR. I am surprised that no one is jumping on the ddPCR bandwagon. I guess it just has its own issues and isn't that popular. Anyone with experience care to comment? Desperately looking to ditch qPCR.
Yep, qPCR can be a real headache. I guess the major issue with ddPCR is the instrumentation cost - $90,000 for a decent system. Once that price comes down, then it'll see more adoption. Just my 2 cents.
cement_head is offline   Reply With Quote
Old 03-07-2016, 09:18 PM   #23
austinso
Member
 
Location: Bay area

Join Date: Jun 2012
Posts: 77
Default

Quote:
Originally Posted by cement_head View Post
Yep, qPCR can be a real headache. I guess the major issue with ddPCR is the instrumentation cost - $90,000 for a decent system. Once that price comes down, then it'll see more adoption. Just my 2 cents.
Yes. That is a huge barrier. That being said, cost per sample will be less than KAPA, since you only need to dilute the sample to 1e-6 and measure it to get the number of molecules. No standards required. You would load 3.3 billion molecules to get 1000K/mm2.
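As a rough sketch of the arithmetic Austin describes (measure absolute copies/uL on a heavily diluted aliquot, multiply back up by the dilution factor, then load by molecule count), assuming made-up example numbers that are not from this thread:

```python
# Back-calculate a library stock concentration from a ddPCR measurement made
# on a diluted aliquot, then work out how much stock holds a target number of
# molecules. The measured value and dilution factor below are illustrative.

def stock_molecules_per_ul(copies_per_ul_measured, dilution_factor):
    """ddPCR reports absolute copies/uL in the diluted sample; multiply
    back up by the dilution factor to recover the undiluted stock."""
    return copies_per_ul_measured * dilution_factor

def volume_for_target(stock_copies_per_ul, target_molecules):
    """Volume of stock (uL) containing the target number of molecules."""
    return target_molecules / stock_copies_per_ul

# Example: 2,500 copies/uL measured on a 1e-6 dilution of the stock
stock = stock_molecules_per_ul(2500, 1e6)   # 2.5e9 copies/uL of stock
vol = volume_for_target(stock, 3.3e9)       # uL of stock for 3.3e9 molecules
print(stock, round(vol, 2))
```

The point of the sketch is that no standard curve enters anywhere: the only inputs are the droplet count and the dilution factor.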

The caveat is that if you have too much unligated adaptor and/or primer carrying over through your purifications, it confounds the measurement and leads to an overestimate. Still, you can quantify and correct for it more readily than with qPCR.

If your libraries are fairly clean, Qubit is actually good enough.

The real advantage is when you need to multiplex across samples per run.

Best

Austin
austinso is offline   Reply With Quote
Old 03-08-2016, 07:12 AM   #24
DNA_Dan
Member
 
Location: Montana

Join Date: Nov 2008
Posts: 21
Default

Assuming the libraries range across the board (TruSeq, Nextera, Kapa Biosystems, NuGEN, etc.), come from a huge variety of DNA/RNA sources, and have different fragment lengths/profiles, how tightly can ddPCR dial them in?

To elaborate, with the Kapa Biosystems qPCR method the tightest we can normalize our pools of samples is to within 1-3 fold of each other. That is to say, sometimes they are pretty even; other times they can vary by up to 3-fold. Can you achieve a better normalization in a pool of very different libraries with completely different efficiencies using the ddPCR method? If so, how tight? 80% or better? 90% or better?

Cost is a huge pill to swallow because $90K buys a lot of normalization kits and technician time. However, I have tried an iterative process of normalizing, measuring with qPCR, then normalizing again, 2-3 times in a row, and what I have found is that pipetting error in the dilutions and measurement error limit how closely you can "dial in" samples with respect to each other. At some point the pools don't get any more "normalized"; they actually start to get worse because of the handling error involved or the measurement itself.

So in essence what I am looking for is something with the accuracy to push the flowcell to its maximum density reproducibly every time and to normalize the pools so evenly that you are squeezing every bit of data possible out of each sample on every run. We also do a lot of ratio pools (30% one customer, 70% another), that sort of thing. Being able to target this accurately down to 1% would allow us to put more customer samples on a run because we would have the confidence that we would hit our ratio targets more accurately.
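The ratio-pool bookkeeping above is straightforward once you have per-library molecule counts. A minimal sketch, assuming hypothetical concentrations and a fixed total molecule budget (none of these numbers come from the thread):

```python
# Given per-library stock concentrations (molecules/uL) and target pool
# fractions, compute the volume of each library to pipette so the pool hits
# the requested ratio at a fixed total molecule count. Inputs are invented.

def pool_volumes(conc_per_ul, fractions, total_molecules):
    """Volumes (uL) of each library stock for the requested pool fractions."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return [total_molecules * f / c for c, f in zip(conc_per_ul, fractions)]

# A 30%/70% customer split of 3.3e9 total molecules across two libraries
vols = pool_volumes([2.0e9, 1.5e9], [0.30, 0.70], 3.3e9)
print([round(v, 3) for v in vols])
```

Whether the real pool lands within 1% of that ratio then depends entirely on how accurate the concentration measurements and the pipetting are, which is the crux of the question.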

Is the QX200 the instrument that can do this? Is this a pipe dream? Where do you feel the QX200 falls short of expectations? What are its limitations?

Ultimately, if the method allows for a tighter multiplex/run scenario as described above, the cost savings from not having to do another run on the sequencer would pay for the instrument. The main cost and driving factor is still the sequencing reagents.

Last edited by DNA_Dan; 03-08-2016 at 07:14 AM.
DNA_Dan is offline   Reply With Quote
Old 03-08-2016, 09:18 AM   #25
buckiseq
Junior Member
 
Location: Nashville, TN

Join Date: Aug 2015
Posts: 6
Default

Just curious, what instruments are you all using for the KAPA qPCR? We are using our QIAGEN Rotor-Gene Q and have enjoyed some pretty good success. We also have the option of using a 7500 Fast Dx and I was wondering if anyone else has used this?

Thanks!
buckiseq is offline   Reply With Quote
Old 03-08-2016, 10:32 AM   #26
cement_head
Senior Member
 
Location: Oxford, Ohio

Join Date: Mar 2012
Posts: 253
Default

Quote:
Originally Posted by buckiseq View Post
Just curious, what instruments are you all using for the KAPA qPCR? We are using our QIAGEN Rotor-Gene Q and have enjoyed some pretty good success. We also have the option of using a 7500 Fast Dx and I was wondering if anyone else has used this?

Thanks!
CFX Connect (Bio-Rad) works well. We also have a Corbett RotorGene 3000 (much older version of the "Q"), and it works on that as well.
cement_head is offline   Reply With Quote
Old 03-08-2016, 12:12 PM   #27
DNA_Dan
Member
 
Location: Montana

Join Date: Nov 2008
Posts: 21
Default

We're using an older Applied Biosystems 7900HT.
DNA_Dan is offline   Reply With Quote
Old 03-08-2016, 02:03 PM   #28
austinso
Member
 
Location: Bay area

Join Date: Jun 2012
Posts: 77
Default

Quote:
Originally Posted by DNA_Dan View Post
Assuming the libraries range across the board (TruSeq, Nextera, Kapa Biosystems, NuGEN, etc.), come from a huge variety of DNA/RNA sources, and have different fragment lengths/profiles, how tightly can ddPCR dial them in?
Within chemistries, pretty good. We can certainly dial in 1000K/mm2 when the libraries are produced with a standardized workflow.

Quote:
To elaborate, with the Kapa Biosystems qPCR method the tightest we can normalize our pools of samples is to within 1-3 fold of each other. That is to say, sometimes they are pretty even; other times they can vary by up to 3-fold. Can you achieve a better normalization in a pool of very different libraries with completely different efficiencies using the ddPCR method? If so, how tight? 80% or better? 90% or better?
I think that libraries from a given chemistry should be within 20% of each other. The biggest reason is that you are measuring the number of molecules directly, not inferring it from a standard curve with a size correction. The other advantage is that you can get a better handle on concentration in the presence of primer/adaptor carryover, which you just cannot do with qPCR. You can also see whether you likely have empty libraries or ones with nice inserts.

Quote:
Cost is a huge pill to swallow because $90K buys a lot of normalization kits and technician time. However, I have tried an iterative process of normalizing, measuring with qPCR, then normalizing again, 2-3 times in a row, and what I have found is that pipetting error in the dilutions and measurement error limit how closely you can "dial in" samples with respect to each other. At some point the pools don't get any more "normalized"; they actually start to get worse because of the handling error involved or the measurement itself.
Absolutely, the $90K price is steep, particularly if it were used only for library quant. But that's really not the point of the instrument.

And absolutely. Any measurement is only as good as the technique used to create the sample to be analyzed. ddPCR is not a panacea of accuracy. I played with a bunch of methods for creating those dilution series, but at the end of the day well-calibrated pipettors and consistent operator pipetting of 1:100 dilutions were the key. And if there is a consistent bias in technique, that can be accommodated in the calculations.

Quote:
So in essence what I am looking for is something with the accuracy to push the flowcell to its maximum density reproducibly every time and to normalize the pools so evenly that you are squeezing every bit of data possible out of each sample on every run. We also do a lot of ratio pools (30% one customer, 70% another), that sort of thing. Being able to target this accurately down to 1% would allow us to put more customer samples on a run because we would have the confidence that we would hit our ratio targets more accurately.

Is the QX200 the instrument that can do this? Is this a pipe dream? Where do you feel the QX200 falls short of expectations? What are its limitations?

Ultimately if the method allows for a tighter multiplex/run scenario as I described above, the cost savings of not having to run another run on the sequencer would pay for itself. The main cost and driving factor is still the sequencing reagents.
Well...it is a pipe dream to believe that any analytical measurement with manual handling steps can achieve 1% variance. Error compounds over the number of handling steps.
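That compounding can be made concrete: for independent multiplicative steps, the relative errors combine roughly in quadrature. A small sketch, with per-step CVs that are assumed for illustration rather than measured:

```python
# Independent relative errors (CVs) across sequential handling steps combine
# roughly in quadrature, so even modest per-step errors put a 1% overall
# target out of reach. The per-step CVs below are assumptions, not data.
import math

def compound_cv(step_cvs):
    """Combined CV for a chain of independent multiplicative steps."""
    return math.sqrt(sum(cv ** 2 for cv in step_cvs))

# Two serial 1:100 dilution steps at 2% CV each, plus a 3% measurement CV
overall = compound_cv([0.02, 0.02, 0.03])
print(round(overall * 100, 2))  # combined CV, in percent
```

Even with well-calibrated pipettors, the combined figure sits well above 1%, which is the point being made about manual handling.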

If you are a high-throughput lab and can fill a full plate more often than not, the cost per sample is very reasonable. The minimal cost-optimal format is groups of 8. If you do a lot of one-offs, the cost per sample goes up, mainly because of the peripheral consumable costs. On that scale the QuantStudio may be better.

From a workflow perspective, once you get the hang of it, it is trivial. And if all you have to do to get the right normalization is a single dilution series and a single measurement (or two...I usually do 1e-6 and 1e-4), then that also has value.

Best

Austin
austinso is offline   Reply With Quote
Old 03-18-2016, 03:08 AM   #29
HelenB
Junior Member
 
Location: London UK

Join Date: Feb 2015
Posts: 1
Question NEBNext versus KAPA library quant kits?

Hi

This is my first post to seqanswers, and I thought it belonged in this thread as it's related. I was using the KAPA kit to quantify PCR-free library preps but switched to the NEBNext library quant kit as it is much cheaper. However, it seems very variable to me (compared to KAPA), and I was wondering if others have tried this kit and what your experience of it was? Thanks
HelenB is offline   Reply With Quote