SEQanswers

Old 01-29-2015, 03:21 PM   #1
alabadorf
Junior Member
 
Location: Boston

Join Date: Jun 2010
Posts: 3
Default DESeq normalization and sample counts of zero

First, apologies if this question is posted elsewhere, but I could not find an answer to my question.

I am using the DESeq normalization procedure on a set of 100 mRNA-Seq samples aligned with STAR and counted with htseq-count. I noticed that the procedure uses only genes with non-zero counts in every sample when calculating size factors. As a result, only 15k of my 52k detected genes enter the calculation, while 21k/52k genes have five or fewer zero counts. If I modify the routine to include these extra genes by omitting the zero counts from the geometric mean calculation, the size factors change: some samples increase, others decrease, and others stay the same, and the relative ordering of samples by size factor also changes. Overall, the size factors shrink as more zeros are allowed, which I think makes sense given that more lowly expressed genes are included, but I'm not entirely sure. Using the normalized counts from the zero-tolerant version modestly changes downstream DE results, but I haven't seen any red flags yet, so I won't post examples.

My question: besides numerical convenience, what is the rationale for omitting genes that have even a single zero count? My feeling is that dropping any gene with a zero biases the normalization toward highly expressed genes and does not take into account lowly, but confidently, expressed ones. Since I have so many samples, I worry this strategy too severely penalizes genes with outlier counts of zero. Is there any danger in modifying the routine to allow, say, at most 5/100 samples to have zeros?
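For concreteness, here is a rough Python sketch of the median-of-ratios calculation I'm describing (DESeq itself does this in R inside estimateSizeFactors; the `max_zeros` parameter is my proposed modification, not anything in DESeq):

```python
import math
from statistics import median

def size_factors(counts, max_zeros=0):
    """Median-of-ratios size factors, DESeq-style (sketch, not DESeq's code).

    counts: list of per-gene lists, counts[g][j] = count of gene g in sample j.
    max_zeros=0 mimics the default behavior: a gene contributes only if every
    sample has a non-zero count. Larger values admit genes with up to
    `max_zeros` zero-count samples, taking the geometric mean over the
    non-zero samples only -- the modification discussed above.
    """
    n_samples = len(counts[0])
    log_ratios = [[] for _ in range(n_samples)]
    for gene in counts:
        nonzero = [c for c in gene if c > 0]
        if len(gene) - len(nonzero) > max_zeros:
            continue  # gene excluded from size-factor estimation
        # per-gene geometric mean over the non-zero samples, in log space
        log_gm = sum(math.log(c) for c in nonzero) / len(nonzero)
        for j, c in enumerate(gene):
            if c > 0:
                log_ratios[j].append(math.log(c) - log_gm)
    # size factor for each sample: median of its count/geomean ratios
    return [math.exp(median(r)) for r in log_ratios]
```

With `max_zeros=0` this reproduces the all-non-zero filter; raising it pulls in the extra genes whose inclusion shrinks the size factors, as described above.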

Thanks.
Old 01-29-2015, 03:56 PM   #2
GenoMax
Senior Member
 
Location: East Coast USA

Join Date: Feb 2008
Posts: 6,575
Default

Check this thread for a discussion of "0" counts: http://seqanswers.com/forums/showthread.php?t=48239
Old 01-29-2015, 04:43 PM   #3
alabadorf
Junior Member
 
Location: Boston

Join Date: Jun 2010
Posts: 3
Default

Thanks for the reply. I understand that zero counts cannot be confidently classified as true absence versus artifact. For most of the genes that have five or fewer zero-count samples, the remaining samples have substantial (100-1000+) read depth, which leads me to believe these genes should be included in the normalization routine. As the number of zero-count samples increases, the overall counts in the remaining samples decrease, which is what we would expect as we approach the sequencing-depth detection limit. This is not the case for genes with only a few zero counts.