SEQanswers
12-08-2011, 01:05 PM   #1
pmiguel
Senior Member
Location: Purdue University, West Lafayette, Indiana
Join Date: Aug 2008
Posts: 2,317

Sequence Analysis Viewer (SAV) Trick
I use SAV extensively to monitor and assess runs, especially while a run is still in progress. One thing I want a feel for is the relationship between cluster density and the quality of the resultant data. Using SAV this is easy to graph, but there is a catch. For example:

[Attached image: scatter of %>Q30 vs. cluster density (K/mm2), one point per panel]

Ignoring densities below 600 K/mm2 on the graph, everything looks as expected: as density increases, %>Q30 drops. Above 900 K/mm2 things get a little crazy, with some panels dropping dramatically while others still look reasonable.
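If you want to make the same plot outside SAV, here is a minimal sketch. It assumes a hypothetical per-panel CSV export (per_tile_metrics.csv) with columns Lane, Density_Kmm2 and PctQ30; the file name and column names are made up, so adjust them to whatever your export actually contains.

[CODE]
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical per-panel export: one row per panel, with columns
# Lane, Density_Kmm2 and PctQ30 (names assumed, not a real SAV file).
tiles = pd.read_csv("per_tile_metrics.csv")

fig, ax = plt.subplots()
for lane, grp in tiles.groupby("Lane"):
    ax.scatter(grp["Density_Kmm2"], grp["PctQ30"], s=10, label=f"Lane {lane}")

ax.set_xlabel("Cluster density (K/mm2)")
ax.set_ylabel("% >= Q30")
ax.legend(title="Lane")
plt.show()
[/CODE]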

The problem is that there is a set of panels that appear to be around 550 K/mm2, a modest density. If you look at them in image view, these panels do not appear to have densities much lower than the other panels in the lane; it just looks like the cluster-calling software decided to give up. Okay, irritating, but something you deal with when pushing the upper cluster density limits.

But to get a sense of the relationship between true cluster density and quality, what is one to do? The cluster density numbers are crocked at very high densities. How about cluster diameters? The higher the density, the more starved for clustering reagents a cluster will be, and the more "stunted" its growth. The bizarrely named "FWHM" (full width at half maximum) is a measure of the diameter of something roughly circular with nebulous borders: you define the border as the average distance from the point of maximum intensity (presumably the center) to the points at half that maximum intensity. How about using that as a stand-in for density?
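Done by hand rather than in SAV, the idea would look roughly like the sketch below: collapse the four channel FWHMs to one value per panel and put that on the x-axis instead of density. Same caveat as above, the export and its Fwhm_A/Fwhm_C/Fwhm_G/Fwhm_T column names are hypothetical.

[CODE]
import pandas as pd
import matplotlib.pyplot as plt

# Same hypothetical per-panel export, now also carrying per-channel FWHM
# columns (Fwhm_A, Fwhm_C, Fwhm_G, Fwhm_T); names are assumed.
tiles = pd.read_csv("per_tile_metrics.csv")

# Collapse the four channels to one number per panel; the median keeps a
# single out-of-focus channel from dominating.
fwhm_cols = ["Fwhm_A", "Fwhm_C", "Fwhm_G", "Fwhm_T"]
tiles["Fwhm"] = tiles[fwhm_cols].median(axis=1)

fig, ax = plt.subplots()
for lane, grp in tiles.groupby("Lane"):
    ax.scatter(grp["Fwhm"], grp["PctQ30"], s=10, label=f"Lane {lane}")

ax.set_xlabel("Cluster FWHM")
ax.set_ylabel("% >= Q30")
ax.legend(title="Lane")
plt.show()
[/CODE]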

Works great:

[Attached image: scatter of %>Q30 vs. FWHM, one point per panel]

--
Phillip
12-08-2011, 01:16 PM   #2
Heisman
Senior Member
Location: St. Louis
Join Date: Dec 2010
Posts: 535

Not gonna lie, that's pretty cool.
12-08-2011, 04:41 PM   #3
csquared
Member
Location: Huntsville, AL
Join Date: May 2008
Posts: 67

FWHM is definitely a very useful metric (except in the index read). We find it most useful to compare it across all four bases: the more even and overlapping the plots are, the better the run.
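One quick way to eyeball that, under the same hypothetical per-panel export and Fwhm_A/Fwhm_C/Fwhm_G/Fwhm_T column names assumed above, is to overlay the per-panel FWHM distributions for the four channels and see how well they line up:

[CODE]
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical per-panel export with one FWHM column per channel.
tiles = pd.read_csv("per_tile_metrics.csv")

fig, ax = plt.subplots()
for base in ["A", "C", "G", "T"]:
    # Outline-only histograms overlay cleanly, so uneven channels stand out.
    ax.hist(tiles[f"Fwhm_{base}"], bins=40, histtype="step", label=base)

ax.set_xlabel("FWHM")
ax.set_ylabel("Number of panels")
ax.legend(title="Channel")
plt.show()
[/CODE]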

In your run, it looks like lane 8 had a wide variation in cluster density per tile compared to lanes 1 and 2, which look more uniform but still have high cluster density. Your graph matches our data: 900K-950K clusters maximizes high-quality data with high output, and as you go over 1M clusters, things drop off rapidly.

You probably know this, but the FWHM value is influenced not only by the clusters themselves but also by focus quality. Newer instruments with higher-megapixel cameras do better on focus/FWHM metrics and data output than older instruments.
__________________
HudsonAlpha Institute for Biotechnology
http://www.hudsonalpha.org/gsl