SEQanswers > Sequencing Technologies/Companies > Illumina/Solexa


Old 04-04-2013, 06:42 AM   #1
Junior Member
Location: UK

Join Date: Mar 2013
Posts: 7
Default Nextera XT Normalisation

Hi all,

I have been using the Nextera XT kit for library preps for MiSeq sequencing of fairly low-complexity libraries, and I have been getting very variable results due to variable cluster density:

1359 K clusters/mm^2
606 K clusters/mm^2
312 K clusters/mm^2

Admittedly, these are different libraries but they are not so dissimilar as to explain such wide variability.

I suspect that the normalisation beads are not functioning as expected. Is anyone else having problems with them?

Or is anyone finding that the normalisation beads work well, and if so, do you have any words of wisdom?

JoeChris38
Old 04-04-2013, 04:15 PM   #2
Location: Philadelphia

Join Date: Dec 2012
Posts: 15

In our [somewhat limited] experience with Nextera XT, we have also had problems with the bead-based normalization (in our case, over-clustering). We talked with Illumina about it, and some points of interest are:

-You MUST use the exact shaker machine that they specify.
-The L1 buffer must be warmed before use, and prepared exactly as specified.
-Beads must be extremely well vortexed for the entire specified time.
-This bead based approach is still somewhat in its infancy and is not completely optimized.

That said, we have decided to start normalizing DNA concentrations manually, without the bead-based approach. I hope this helps.
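Manual normalization is essentially dilution arithmetic: convert a fluorometric concentration (ng/µL) plus the average fragment size into molarity, then dilute each library to a common target. A minimal sketch; the function names, the 2 nM target, and the input values are illustrative and not taken from any Illumina protocol:

```python
# Convert a Qubit/PicoGreen reading (ng/uL) and an average fragment size
# (bp, e.g. from a Bioanalyzer trace) into molarity, then compute the
# dilution needed to reach a common target concentration.

AVG_BP_MASS = 660.0  # g/mol per base pair of double-stranded DNA

def library_nM(conc_ng_per_ul, avg_fragment_bp):
    """Molar concentration (nM) of a dsDNA library."""
    return conc_ng_per_ul * 1e6 / (AVG_BP_MASS * avg_fragment_bp)

def dilution_volumes(stock_nM, target_nM, final_ul):
    """Volumes (uL) of stock and diluent to make `final_ul` at `target_nM`."""
    stock_ul = target_nM * final_ul / stock_nM
    return stock_ul, final_ul - stock_ul

conc = library_nM(2.0, 500)                       # 2 ng/uL, ~500 bp library
stock, diluent = dilution_volumes(conc, 2.0, 20)  # dilute to 2 nM in 20 uL
```

The same two functions applied per library give every sample the same molarity before pooling, which is what the bead-based step is meant to achieve.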
MicroBio
Old 02-28-2014, 06:48 AM   #3
Shlomo Blum
Location: Israel

Join Date: Nov 2012
Posts: 10

How? Could you share a protocol or reference?
Shlomo Blum
Old 05-14-2014, 05:53 PM   #4
Junior Member
Location: Palm Bay, Fl

Join Date: Nov 2009
Posts: 6

We usually process fewer than 10 libraries at a time. In this case, we have found that manual normalization is both easier and gives more predictable cluster densities than the bead normalization procedure. We also find the distribution of indices to be much more even. So if you aren't running at high sample throughput, I would recommend giving manual normalization a try.
rwinegar
Old 05-22-2014, 01:30 AM   #5
Junior Member
Location: Olsztyn, Poland

Join Date: May 2011
Posts: 7

We found that Kapa normalization works fine. Bead-based normalization never gave us an even read distribution among libraries.
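Once each library has a molarity (e.g. from a qPCR-based quantification kit), even read distribution comes down to pipetting an equal molar amount of each into the pool. A small sketch; the function name and example values are illustrative:

```python
# Equimolar pooling: take a fixed molar amount (femtomoles) of each
# library and compute the volume to pipette from each tube.

def pooling_volumes(concs_nM, fmol_each=10.0):
    """Volume (uL) per library contributing `fmol_each` femtomoles.

    nM is equivalent to fmol/uL, so volume = amount / concentration.
    """
    return {name: fmol_each / c for name, c in concs_nM.items()}

vols = pooling_volumes({"libA": 4.0, "libB": 2.0, "libC": 8.0}, fmol_each=8.0)
# -> libA: 2.0 uL, libB: 4.0 uL, libC: 1.0 uL (equal molar input each)
```

The less concentrated a library, the more volume it contributes, so each ends up equally represented in the pool.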

Last edited by tymek666; 05-22-2014 at 07:25 AM.
tymek666
Old 05-26-2014, 02:58 AM   #6
Location: Moscow

Join Date: Aug 2012
Posts: 20

I've carried out the protocol several times with various shaking speeds (1200-1800 rpm) and have always got cluster densities of ~800 K/mm^2. Only once did I get slightly overclustered results, ~1300 K/mm^2 with v2 reagents (though I attribute that to incorrect size-selection parameters), and I still managed to get good data. Nextera XT is a well-optimised protocol designed to avoid errors at every stage, provided you do everything correctly.
Etherella

Tags: beads, cluster density, miseq, nextera xt, normalisation
