  • #16
    Originally posted by kerplunk412
    I have noticed something similar when quantifying gDNA by Nanodrop. I saw values that varied quite a bit when reading different "drops" from the same tube of gDNA, and this was solved by vortexing the DNA for 10 seconds. The idea is that genomic DNA molecules are so large that each microliter may contain a varying amount of these large molecules. After vortexing, the gDNA is in much smaller fragments, which distribute more evenly in solution, so every microliter contains a much more similar amount of DNA. Think of it as grabbing handfuls of sand versus handfuls of medium-sized rocks and weighing them: the handfuls of sand will weigh nearly the same each time, while the rocks will vary much more. I haven't tested the theory about large versus sheared DNA, but we have tested vortexing DNA for 10 seconds prior to reading on the Nanodrop, and it definitely gives much more consistent readings.
    The mass of the genome in one human cell is about 6.6 pg, so 1 µl of a 10 ng/µl DNA solution contains the DNA equivalent of roughly 1,515 cells, which would be about 45,450,000 fragments of 100 kb. Most standard extraction methods yield fragments smaller than 100 kb. So I do not see how 45.5 million fragments in 1 µl could aggregate in solution enough to give 10x variation between consecutive reads. I think a gentle flick would be enough to homogenize the solution (if the sample was frozen), and vortexing would certainly damage large DNA fragments.
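The arithmetic above can be checked with a short back-of-the-envelope script. Note this is a sketch under stated assumptions (~6.6 pg DNA per diploid human cell, ~650 g/mol per base pair of dsDNA, ~100 kb fragments); a mass-based count gives roughly 9 × 10⁷ fragments per µl, about twice the 45.45 million quoted (which corresponds to dividing a 3 Gb haploid genome into 100 kb pieces), but either way there are tens of millions of fragments per microliter, which is the point of the argument.

```python
# Back-of-the-envelope check of the fragment-count argument above.
# Assumptions (hedged): ~6.6 pg DNA per diploid human cell, ~650 g/mol
# per base pair of double-stranded DNA, extraction fragments of ~100 kb.

AVOGADRO = 6.022e23          # molecules per mole
BP_MASS = 650 / AVOGADRO     # grams per base pair of dsDNA

sample_mass_g = 10e-9        # 10 ng of DNA in 1 ul of a 10 ng/ul solution
genome_mass_g = 6.6e-12      # ~6.6 pg per diploid human cell

cells_equivalent = sample_mass_g / genome_mass_g    # ~1,515 cell equivalents
fragment_mass_g = 100_000 * BP_MASS                 # mass of one 100 kb fragment
fragments_per_ul = sample_mass_g / fragment_mass_g  # ~9e7 fragments

print(f"cell equivalents per ul: {cells_equivalent:,.0f}")
print(f"fragments per ul:        {fragments_per_ul:,.0f}")
```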

    • #17
      Originally posted by nucacidhunter
      The linear detection range of PicoGreen spans four orders of magnitude, from 1 ng/ml to 1000 ng/ml DNA. As long as the fluorometer is calibrated at 0 and 1000, there is no need for any other concentration in between, or for a standard curve; it seems a waste of money and time. With correct calibration, one only needs to multiply the fluorescence value by the dilution factor to calculate the original concentration of the DNA sample.
      The 500-reaction kit is cheap (about 60 cents per sample), and it takes only a matter of minutes to add a few extra standards. You are correct regarding the linear detection range; however, I think you should review linear regression — fitting a curve through several standards averages out the error in any single point, which a two-point calibration cannot do.
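The regression point can be sketched in a few lines. This is an illustration only: the standard concentrations and fluorescence counts below are hypothetical, not from a real PicoGreen run, and the fit is a plain ordinary least-squares line.

```python
# Sketch: why extra standards help. A two-point calibration forces the
# line through the two measured endpoints; a multi-point standard curve
# fit by least-squares regression averages out pipetting/reading error
# and lets you check linearity. All numbers below are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit: returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Hypothetical standards (ng/ml) and noisy fluorescence readings.
standards    = [0, 10, 100, 500, 1000]
fluorescence = [52, 248, 2010, 9950, 20180]  # made-up instrument counts

slope, intercept = fit_line(standards, fluorescence)

def concentration(f, dilution_factor=1.0):
    """Back-calculate a sample's concentration from its fluorescence."""
    return (f - intercept) / slope * dilution_factor

print(round(concentration(4100), 1))  # sample that read ~4100 counts
```

With a true two-point calibration, any error in the 0 or 1000 standard propagates directly into every sample; the regression spreads that risk across all five standards.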

      • #18
        Originally posted by nucacidhunter
        The mass of the genome in one human cell is about 6.6 pg, so 1 µl of a 10 ng/µl DNA solution contains the DNA equivalent of roughly 1,515 cells, which would be about 45,450,000 fragments of 100 kb. Most standard extraction methods yield fragments smaller than 100 kb. So I do not see how 45.5 million fragments in 1 µl could aggregate in solution enough to give 10x variation between consecutive reads. I think a gentle flick would be enough to homogenize the solution (if the sample was frozen), and vortexing would certainly damage large DNA fragments.
        Your logic makes sense to me, so perhaps the difference in size before and after vortexing does not explain my observations. However, I tested this fairly rigorously, and a few of my colleagues have tried it as well, so I can say with confidence that with the gDNA samples I was working with, a gentle flick was not enough to get a consistent reading; vortexing was required. As for damaging the DNA, I am fairly sure 10 seconds of vortexing will not cause enough fragmentation to matter for most NGS applications. If it were that easy to fragment DNA into small pieces, no one would need to buy a Covaris!

        Edit: I should also mention that the variation seen before vortexing was at most about 2x. Variation after vortexing was ~1%.
        Last edited by kerplunk412; 03-05-2015, 04:52 PM.
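The sand-versus-rocks analogy from post #16 can be framed as a counting-statistics problem. If each 1 µl drop samples N freely diffusing DNA particles at random, Poisson statistics predict a relative variation (coefficient of variation) of roughly 1/√N. The particle counts below are illustrative: with tens of millions of free 100 kb fragments per microliter the predicted variation is far below 1%, so a ~2x pre-vortex spread would require very large aggregates (few particles per drop), not merely large individual fragments.

```python
# Counting-statistics sketch of the sand-vs-rocks analogy: fewer,
# larger "particles" per sampled drop means higher relative variation.
# CV ~ 1/sqrt(N) for a Poisson-distributed count of N particles.
import math

def poisson_cv(n_particles):
    """Coefficient of variation for a Poisson count of n particles."""
    return 1.0 / math.sqrt(n_particles)

for n in (100, 10_000, 45_450_000):  # illustrative particle counts per ul
    print(f"{n:>12,} particles/ul -> CV ~ {poisson_cv(n):.3%}")
```

This is consistent with both observations in the thread: free fragments alone cannot produce 2x swings, but a poorly mixed or aggregated sample effectively has a very small N per drop until it is homogenized.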
