#1 |
Junior Member | Location: UK | Join Date: Mar 2013 | Posts: 7
Hi all,

I have been using the Nextera XT kit for my library preps for MiSeq sequencing of fairly low-complexity libraries, and I have been getting very variable results because of variable cluster density:

- 1359 K clusters per mm^2
- 606 K clusters per mm^2
- 312 K clusters per mm^2

Admittedly, these are different libraries, but they are not so dissimilar as to explain such wide variability. I suspect that the normalisation beads are not functioning as I expected. Is anyone else having problems with the normalisation beads? Or is anyone finding that the normalisation beads work well and can offer any words of wisdom? Thanks!
#2 |
Member | Location: Philadelphia | Join Date: Dec 2012 | Posts: 15
In our (somewhat limited) experience with Nextera XT, we have also had some problems with the bead-based normalization (in our case, we were getting over-clustering). We talked with Illumina about it, and some points of interest are:

- You MUST use the exact shaker machine that they specify.
- The L1 buffer must be warmed before use and prepared exactly as specified.
- Beads must be extremely well vortexed for the entire specified time.
- This bead-based approach is still somewhat in its infancy and is not completely optimized.

With that said, we have decided to start normalizing the DNA concentrations manually, without the bead-based approach. I hope this helps.
#3 |
Member | Location: Israel | Join Date: Nov 2012 | Posts: 10
How? Could you share a protocol or reference?
#4 |
Junior Member | Location: Palm Bay, FL | Join Date: Nov 2009 | Posts: 6
We usually process fewer than 10 libraries at a time. In that case, we have found that manual normalization is both easier and gives more predictable cluster densities than the bead normalization procedure. We also find the distribution of indices to be much more even. So if you aren't running at high sample throughput, I would recommend giving manual normalization a try.
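For anyone asking "how" above, here is a minimal sketch of the arithmetic typically behind manual normalization (this is my illustration, not an official Illumina or poster-supplied protocol; the 4 nM target and 20 uL pool volume are assumptions you should adjust for your own workflow). Given each library's fluorometric concentration (ng/uL) and mean fragment size (bp), convert to molarity, then dilute every library to a common target before pooling equal volumes:

```python
# Hypothetical manual-normalization arithmetic (not an official protocol).
# Converts a dsDNA concentration to nanomolar, then computes the dilution
# needed to bring a library to a common target molarity.

AVG_BP_WEIGHT = 660.0  # approx. g/mol per base pair of double-stranded DNA

def molarity_nM(conc_ng_per_ul, mean_frag_bp):
    """Convert a dsDNA concentration (ng/uL) to nanomolar, given mean fragment size."""
    return conc_ng_per_ul / (AVG_BP_WEIGHT * mean_frag_bp) * 1e6

def dilution_volumes(conc_ng_per_ul, mean_frag_bp, target_nM=4.0, final_ul=20.0):
    """Volumes (uL) of library stock and diluent needed to reach target_nM in final_ul."""
    stock_nM = molarity_nM(conc_ng_per_ul, mean_frag_bp)
    lib_ul = target_nM * final_ul / stock_nM
    return lib_ul, final_ul - lib_ul

# Example: a 12 ng/uL library with a 500 bp mean fragment size
lib_ul, diluent_ul = dilution_volumes(12.0, 500)
```

Once every library sits at the same molarity, pooling equal volumes gives (in principle) an even index distribution, which is the predictability the bead-based approach was struggling to deliver here.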
#5 |
Junior Member | Location: Olsztyn, Poland | Join Date: May 2011 | Posts: 7
We found that Kapa normalization works fine. Bead-based normalization never gave us an even read distribution among libraries.

Last edited by tymek666; 05-22-2014 at 07:25 AM.
#6 |
Member | Location: Moscow | Join Date: Aug 2012 | Posts: 20
I've carried out the protocol several times with various shaking speeds (1200-1800 rpm) and have always got cluster densities of ~800 K/mm^2. Only once did I get slightly over-clustered results, ~1300 K/mm^2, with v2 reagents (but I attribute that to incorrect size-selection parameters), and I still managed to get good results. Nextera XT is a very optimised protocol that is meant to avoid errors at every stage (provided you do everything correctly).
Tags |
beads, cluster density, miseq, nextera xt, normalisation |