  • #16
    Yes, pairs are always better for measuring start points and distinct molecules, but with single-ended reads we simply assume that any replicated start point is PCR-induced. This is conservative, and we count only unique placements in the genome, so it is yet again an underestimate. The above example was 5 cycles of PCR with long extensions.
    We have also made no-amplification libraries, which eliminate the PCR replication problem. TaqMan is another angle we use to measure at the linker step; I'll have to check our notes, but I believe we are getting 300M-500M positive beads with 1 ng of 200 bp library. About 15% of our total beads (3B total) amplify, which suggests, based on Poisson statistics, that few have 2 molecules in the bubble. That is not far off from your estimate of 5M molecules per 1 pg of 200 bp library, or 5B per ng.
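
    To make those numbers concrete, here is a quick back-of-the-envelope sketch in Python. The ~650 g/mol per base pair and the simple Poisson loading model are assumptions for illustration, not measurements from our system:
    Code:
    import math

    AVOGADRO = 6.022e23
    BP_MW = 650.0          # approx. g/mol per bp of double-stranded DNA (assumed)
    frag_len = 200         # bp, the library size discussed above

    # Molecules of a 200 bp fragment in 1 pg of library
    molecules_per_pg = 1e-12 / (frag_len * BP_MW) * AVOGADRO
    print(f"~{molecules_per_pg:.1e} molecules per pg")   # ~4.6e6, i.e. ~5M

    # Poisson loading: if 15% of beads end up template-positive, the mean
    # templates per bead is lambda = -ln(1 - 0.15); from that, the share of
    # *positive* beads carrying two or more templates follows directly.
    frac_positive = 0.15
    lam = -math.log(1 - frac_positive)
    p0 = math.exp(-lam)                   # beads with zero templates
    p1 = lam * math.exp(-lam)             # beads with exactly one template
    multi = (1 - p0 - p1) / (1 - p0)      # >1 template, among positive beads
    print(f"lambda ~ {lam:.3f}; ~{multi:.0%} of positive beads carry >1 template")
    # -> lambda ~ 0.163; ~8% of positive beads carry more than one molecule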

    In our case the DNA was not size selected, so we can't blame the gel extraction or AMPure.

    We spent some time looking into peroxide formation during the shearing step and monitoring heat. I'm curious whether anyone has used the NEB PreCR kit, or its Fpg and hOGG1 repair enzymes, to repair other forms of DNA damage, such as 8-oxoG or glycosidic bond breaks, that may be induced by this method.
    Alternatively, might DNase-based methods of fragmenting DNA be less harsh?

    Despite leaning high on beads in the emPCR, we currently can't get a bead into every reactor without a lot of clumping; probably only 20-25% of the reactors are populated. RainDance-like techniques have been contemplated, but they would take days to make billions of reactors.

    • #17
      Originally posted by Nitrogen-DNE-sulfer View Post
      In our case the DNA was not size selected, so we can't blame the gel extraction or AMPure.
      Not size selected at all? You must have size-selected at some point, right? Otherwise you would get slammed with primer dimers, etc.

      If you had size-selected prior to going into adaptor ligation, then all the DNA has the potential to form a legitimate amplicon. If not, a lot of that 750 pg would really be contributed by fragments outside a useful size range.

      Originally posted by Nitrogen-DNE-sulfer View Post
      We spent some time looking into peroxide formation during the shearing step and monitoring heat. I'm curious whether anyone has used the NEB PreCR kit, or its Fpg and hOGG1 repair enzymes, to repair other forms of DNA damage, such as 8-oxoG or glycosidic bond breaks, that may be induced by this method.
      Alternatively, might DNase-based methods of fragmenting DNA be less harsh?
      I have not. But I also have no clear idea how much DNA damage is present in a typical genomic DNA prep. If, for example, 90% of DNA spans longer than a few hundred bases contained lesions bad enough to stall out PCR replication, would we even notice?

      PCR is an exponential process, after all. The 90% of strands that could not be extended would not contribute as template to later cycles. So the 10% that did extend far enough for the reverse primer to anneal would quickly overtake those that stalled.

      Again, even if DNA damage isn't that bad in most DNA preps, it could be that a prep you happened to leave out near a window picked up some pyrimidine dimers from the sunlight streaming in. Who knows? So far, all the assays I can think of would be insensitive to even fairly high levels of damage. If even 1% of the 1 kb stretches of DNA in a prep are damage-free, that still gives you about 10 billion intact 1 kb stretches per ug of DNA. If those 10 billion work, then the 990 billion that do not will never be noticed.
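
      A minimal sketch of both points, in Python; the 25-cycle count and the ~650 g/mol per base pair figure are assumptions chosen only to illustrate the argument:
      Code:
      import math

      # 1) Exponential takeover: damaged strands stall in cycle 1 and never
      #    serve as template again; intact strands double every cycle.
      damaged_frac, intact_frac = 0.90, 0.10   # the hypothetical 90/10 split
      cycles = 25                              # illustrative cycle count (assumed)
      amplified = intact_frac * 2 ** cycles    # assume perfect doubling per cycle
      stalled = damaged_frac                   # persists, but never amplifies
      print(f"damaged share of final product: {stalled / (amplified + stalled):.1e}")
      # -> ~2.7e-07: the stalled 90% is effectively invisible after PCR

      # 2) The "10 billion intact 1 kb stretches per ug" arithmetic
      AVOGADRO = 6.022e23
      BP_MW = 650.0                            # approx. g/mol per bp of dsDNA (assumed)
      per_ug = 1e-6 / (1000 * BP_MW) * AVOGADRO
      print(f"1 kb stretches per ug: {per_ug:.1e}")                      # ~9e11, ~1 trillion
      print(f"intact if only 1% are damage-free: {0.01 * per_ug:.1e}")   # ~9e9, ~10 billion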

      We may just be entering an era where we do need all 1 trillion molecules. If so, we need to either make sure this type of damage is not an issue or find ways to mitigate it.
      --
      Phillip

      • #18
        You are right IF damage occurs randomly. If it does NOT, all the cool events happen in the 90% that you are losing.

        • #19
          Originally posted by What_Da_Seq View Post
          You are right IF damage occurs randomly. If it does NOT, all the cool events happen in the 90% that you are losing.
          There you go. Any time you lose 90% of a sample (for whatever reason), the remnant may be a biased representation of your initial sample, because there is no reason to presume the loss is unbiased.

          In the case of many of these library construction protocols we lose more like 99.9999% of our initial sample.

          • #20
            Originally posted by pmiguel View Post
            ...Also, even the 99% potential loss of this step only explains 2 of the >6 orders of magnitude of DNA loss in the Roche protocol...
            --
            Phillip
            I just wonder how precise that 99% number is. It was stated as >99%, but is it maybe 98.5%, 99.9%, 99.99%, ...? So it might explain 2, 3, or 4 orders of magnitude (though unlikely to be 6).
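
            For reference, the orders of magnitude explained by a given loss work out to -log10(fraction retained); a quick Python check using the example percentages floated above:
            Code:
            import math

            # orders of magnitude explained = -log10(fraction retained)
            for loss in (0.985, 0.99, 0.999, 0.9999):
                retained = 1 - loss
                print(f"{loss:.2%} lost -> {-math.log10(retained):.1f} orders")
            # 98.50% -> 1.8, 99.00% -> 2.0, 99.90% -> 3.0, 99.99% -> 4.0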

            It would also be interesting to know the loss or damage in the other steps.
            Last edited by seqAll; 11-23-2009, 11:19 AM.

            • #21
              One can design assays for DNA damage using enzymes that target it and exploit the products those enzymes create. We've used antibodies against 8-oxoG to measure photodamage on arrays, and without proper scanning buffers it lights up like a Christmas tree. So we don't have to live with this; it can be measured, but tuning a quantitative assay takes a lot of time and effort. Most assays are specific for an a priori lesion of interest, so this is a long slog, and quantifying total DNA alongside amplifiable DNA is probably too imprecise, though it should be able to pick up 90% effects like the one you mention (one lesion eliminates a whole molecule).

              Thymidine dimers from light through windows are one angle. I'd also point out low-bind tubes and DNA adherence. And while you're at it, don't ship your DNA through certain zip codes or airports, as the E-beam dosage varies. We're going through the steps that assume perfect DNA, seeing how poor the efficiencies are, and then pushing them to the highest we can achieve. If that simple hypothesis fails, we'll no doubt dig into the damage more, as the losses are higher than anyone can explain.

              Great thread; very pertinent to the field.

              • #22
                Originally posted by Nitrogen-DNE-sulfer View Post
                One can design assays for DNA damage using enzymes that target it and exploit the products those enzymes create. We've used antibodies against 8-oxoG to measure photodamage on arrays, and without proper scanning buffers it lights up like a Christmas tree. So we don't have to live with this; it can be measured, but tuning a quantitative assay takes a lot of time and effort. Most assays are specific for an a priori lesion of interest, so this is a long slog, and quantifying total DNA alongside amplifiable DNA is probably too imprecise, though it should be able to pick up 90% effects like the one you mention (one lesion eliminates a whole molecule).

                Thymidine dimers from light through windows are one angle. I'd also point out low-bind tubes and DNA adherence. And while you're at it, don't ship your DNA through certain zip codes or airports, as the E-beam dosage varies. We're going through the steps that assume perfect DNA, seeing how poor the efficiencies are, and then pushing them to the highest we can achieve. If that simple hypothesis fails, we'll no doubt dig into the damage more, as the losses are higher than anyone can explain.

                Great thread; very pertinent to the field.
                Naively, I would think that hydrolyzing a DNA sample into mononucleotides and then running some flavor of mass spec would give the most global overview of DNA damage. There are caveats, of course, but damage introduced by the protocol itself could probably be discovered and corrected for.

                As to pyrimidine dimers, I seem to remember that there was an enzyme (from celery?) that reversed the dimerization process. Once we get to the point where we are repairing our DNA samples prior to assays, that would be a good enzyme to deploy. All the bacterial remedies for the pyrimidine dimer problem sounded far less appealing, if I remember this correctly.

                --
                Phillip

                • #23
                  A side comment:
                  We found that the protocol for finishing the single-stranded library did not neutralize the solution sufficiently, and we were not recovering enough molecules at that step. Once we realized that the pH was off and added a lot more sodium acetate, our recovery skyrocketed.

                  • #24
                    Good challenge! I have been worrying about the same problem, too.

                    • #25
                      Super-interesting thread. One obvious way to reduce DNA damage during library prep is to use a visible-light stain and a white-light box for gel-based size selection instead of EtBr and UV. I have had good success with Nile blue sulfate, an inexpensive visible-light stain. I have not directly compared an identical library processed with EtBr/UV versus Nile blue/white light, though, so I don't know the quantitative effect of the UV damage.

                      • #26
                        Originally posted by greigite View Post
                        Super-interesting thread. One obvious way to reduce DNA damage during library prep is to use a visible-light stain and a white-light box for gel-based size selection instead of EtBr and UV. I have had good success with Nile blue sulfate, an inexpensive visible-light stain. I have not directly compared an identical library processed with EtBr/UV versus Nile blue/white light, though, so I don't know the quantitative effect of the UV damage.
                        Yes, we don't use EtBr or UV light boxes in the lab at all. We just use SYBR Safe and one of those "Dark Reader" boxes. We bought the latter from Clare Chemical.

                        --
                        Phillip

                        • #27
                          Some of the DNA is damaged during shearing. My experience has been that, under certain conditions, only half of the ends can be repaired on one molecule. I don't know the reason yet, but I think the DNA was turned single-stranded. The damage, I think, is mostly mechanical, not chemical. The HydroShear runs at a fairly low speed but generates only slightly better-quality DNA. Some of the damage just can't be repaired: I tried to repair the single-end adaptor-ligated product, but it didn't go any further no matter how much enzyme I put in.

                          • #28
                            Originally posted by nextgen View Post
                            Some of the DNA is damaged during shearing. My experience has been that, under certain conditions, only half of the ends can be repaired on one molecule. I don't know the reason yet, but I think the DNA was turned single-stranded. The damage, I think, is mostly mechanical, not chemical. The HydroShear runs at a fairly low speed but generates only slightly better-quality DNA. Some of the damage just can't be repaired: I tried to repair the single-end adaptor-ligated product, but it didn't go any further no matter how much enzyme I put in.
                            Could you give us a little more detail on what the conditions were where only "half of the ends can be repaired on one molecule"?

                            Any 5' or 3' overhang (locally single-stranded) can be blunted by T4 polymerase's 5'->3' polymerase or 3'->5' exonuclease activity, respectively. The polymerase activity does rely on a 3'-OH, but T4 PNK should remove any 3' phosphate that would otherwise block the polymerase.

                            So, if your end-repair regimen includes both T4 polymerase and T4 polynucleotide kinase (and most do), the remaining culprits would be DNA fragment ends carrying something other than a phosphate or a hydroxyl.

                            I have been able to find little in the literature about this possibility; the classic Richards and Boyer paper [1] mentions it, and I summarized their results here:

                            Techniques and protocol discussions on sample preparation, library generation, methods and ideas


                            As for a comparison with the HydroShear: the results published by Oefner et al. in their 1996 paper describing the HydroShear prototype [2], in which 20-40% of the fragments produced could be ligated with no end repair, suggest that it is qualitatively different from sonication.

                            1. Richards OC, Boyer PD (1965) Chemical mechanism of sonic, acid, alkaline and enzymic degradation of DNA. Journal of Molecular Biology 11: 327-340.
                            2. Oefner PJ, Hunicke-Smith SP, Chiang L, Dietrich F, Mulligan J, et al. (1996) Efficient random subcloning of DNA sheared in a recirculating point-sink flow system. Nucleic Acids Research 24: 3879-3886.

                            --
                            Phillip
