Hi Thibault,
The only circumstances under which going to higher bead densities would result in a high percentage of PCR duplicates are:
(1) There are some high-intensity beads bleeding signal onto their neighbors--hence duplicate sequences. You could actually check this if your informatic skills were mighty enough: the read names encode bead coordinates, so if your PCR duplicates are frequently seen spatially adjacent to one another, you can infer that you have a signal bleed-over issue. A rough way to run that check is sketched just below.
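To be clear, the sketch below is not from any particular pipeline, just an illustration of how I would do the adjacency check. It assumes duplicates are already grouped (say, by identical sequence) and that read names contain x and y as the second and third underscore-separated fields; you would need to adapt the parsing to your instrument's actual naming scheme.

import random
import math

def coords(read_name):
    # Assumed read-name layout: "panel_x_y..."; adjust indexing for your instrument.
    parts = read_name.split("_")
    return float(parts[1]), float(parts[2])

def mean_pair_distance(names):
    # Average Euclidean distance between all pairs of reads in one duplicate group.
    pts = [coords(n) for n in names]
    dists = [math.dist(a, b) for i, a in enumerate(pts) for b in pts[i + 1:]]
    return sum(dists) / len(dists) if dists else 0.0

def bleed_over_check(duplicate_groups, all_read_names, n_random=10000):
    # Compare distances within duplicate groups to distances between random reads.
    # Much smaller within-group distances point to signal bleed-over rather than
    # ordinary PCR duplication.
    dup_mean = sum(mean_pair_distance(g) for g in duplicate_groups) / len(duplicate_groups)
    rand_dists = [math.dist(coords(random.choice(all_read_names)),
                            coords(random.choice(all_read_names)))
                  for _ in range(n_random)]
    rand_mean = sum(rand_dists) / len(rand_dists)
    print("mean distance within duplicate groups: %.1f" % dup_mean)
    print("mean distance between random reads:    %.1f" % rand_mean)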
(2) Your library is bottomed out. That is, you started with 0.1 pg of original template with an average size of (say) 100 bp--that is roughly 1 million DNA fragments. If you do a sequencing run and produce 100 million reads from the library you made from those initial 1 million fragments, you are guaranteed to average 100 PCR duplicates of each. The above example is extreme; it is unlikely you would bottom out a library that badly. But that is the idea: if, at any point in your protocol, you bottleneck the complexity of your library below the number of reads you ultimately produce, you will end up with PCR duplicates. The arithmetic is worked through below.
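If you want to see the numbers, here is that arithmetic written out. The 650 g/mol per base pair figure and the Poisson sampling model are my own additions, just to make the "roughly 1 million fragments" estimate and the resulting duplicate fraction explicit:

import math

AVOGADRO = 6.022e23
GRAMS_PER_MOL_PER_BP = 650  # approximate mass of one double-stranded base pair

def molecules_from_mass(mass_pg, mean_length_bp):
    # Number of template molecules in a given mass of double-stranded DNA.
    grams = mass_pg * 1e-12
    moles = grams / (mean_length_bp * GRAMS_PER_MOL_PER_BP)
    return moles * AVOGADRO

def expected_duplicate_fraction(n_reads, n_molecules):
    # If reads sample molecules uniformly at random (Poisson model),
    # distinct molecules observed = M * (1 - exp(-N/M)).
    distinct = n_molecules * (1.0 - math.exp(-n_reads / n_molecules))
    return 1.0 - distinct / n_reads

templates = molecules_from_mass(0.1, 100)  # ~1e6 fragments, as in the example above
print("starting molecules: %.2e" % templates)
print("duplicate fraction at 100M reads: %.1f%%" %
      (100 * expected_duplicate_fraction(1e8, templates)))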
(3) Amplicon contamination. If any of your ePCR amplicons end up in your library construction area, they will contaminate subsequent libraries you construct. This isn't directly related to the number of beads, but if you have a low level of amplicon contamination in a library, you might not notice it until you sequence deeper into the library. One crude way to screen for it is sketched below.
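The sketch below is just one rough screen I might try, not an established method: identical read sequences that recur across libraries built from unrelated samples are suspicious, since genuine PCR duplicates should stay within a single library. It assumes simple one-line-per-record FASTA files, and the file names are hypothetical:

from collections import Counter

def read_sequences(fasta_path):
    # Yield sequences from a simple FASTA file with one sequence line per record.
    with open(fasta_path) as handle:
        for line in handle:
            line = line.strip()
            if line and not line.startswith(">"):
                yield line

def shared_duplicates(fasta_a, fasta_b, min_copies=2):
    # Sequences duplicated in library A that also show up in library B.
    counts_a = Counter(read_sequences(fasta_a))
    seqs_b = set(read_sequences(fasta_b))
    return {seq for seq, n in counts_a.items() if n >= min_copies and seq in seqs_b}

suspects = shared_duplicates("libraryA.fa", "libraryB.fa")
print("%d duplicated sequences from library A also found in library B" % len(suspects))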

--
Phillip