SEQanswers





Old 05-23-2014, 05:52 PM   #1
fourie.joubert@up.ac.za
Junior Member
 
Location: Pretoria South Africa

Join Date: Jul 2011
Posts: 1
Default Ion Torrent Ampliseq: duplicate removal / coverage / variants?

Hi Folks

I would love to get some opinions regarding duplicate removal during variant calling for the AmpliSeq Comprehensive Cancer Panel.

I have noted some previous discussions on the topic on this forum, but I am experiencing some additional issues.

We have been doing tumor vs. normal samples, and do the realignment and base quality score recalibration steps (no duplicate removal) using GATK. We are currently testing a variety of somatic variant callers (Strelka, VarScan, MuTect, JointSNVMix, SomaticSniper).

Ion Torrent states the following:

"Marking duplicate reads is not appropriate for Ion AmpliSeq data, because many independent reads are expected to share the same 5' alignment position and 3' adapter flow as each other. Marking duplicates on an Ion AmpliSeq run risks inappropriately flagging many reads that are in fact independent of one another."

When doing visual inspection of (possible) variant positions in IGV, many of the variants seem to have been called based on reads that certainly look like duplicates on inspection. There also seems to be a trend for variant positions to fall at the edges of these reads.

However, if I do duplicate removal, my average coverage drops from 300x to 9x.

The question of duplicate reads vs. true coverage depth obviously has quite serious implications for all variant-calling statistics...
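To illustrate why position-based duplicate marking collapses amplicon coverage so dramatically, here is a toy sketch (plain Python, purely illustrative, not tied to any real tool): every read from a given amplicon starts at the same primer-defined 5' coordinate, so a Picard-style "same chrom/strand/start = duplicate" rule keeps roughly one read per amplicon per strand, no matter how many independent molecules were sequenced.

```python
from collections import defaultdict

def mark_duplicates_by_start(reads):
    """Toy Picard-style rule: reads sharing (chrom, strand, 5' start)
    form one duplicate set; keep a single representative."""
    kept = {}
    for read in reads:
        key = (read["chrom"], read["strand"], read["start"])
        kept.setdefault(key, read)  # first read wins, the rest are flagged
    return list(kept.values())

# Simulate one AmpliSeq amplicon sequenced to 300x: all reads are
# independent molecules, yet they share the same primer-defined start.
reads = [{"chrom": "chr1", "strand": "+", "start": 1000} for _ in range(150)]
reads += [{"chrom": "chr1", "strand": "-", "start": 1250} for _ in range(150)]

deduped = mark_duplicates_by_start(reads)
print(len(reads), "->", len(deduped))  # 300 reads collapse to 2
```

This is exactly the coverage collapse you observed (300x down to single digits); with shotgun libraries the start positions are random, so the same rule removes only genuine PCR/optical duplicates.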

Any advice / opinions / debate would be much appreciated.

Best regards!

Fourie
Old 05-24-2014, 05:10 AM   #2
Bukowski
Senior Member
 
Location: Aberdeen, Scotland

Join Date: Jan 2010
Posts: 388
Default

Well, it's exactly as Ion Torrent advises - you can't deduplicate amplicon data on any platform; the same applies to HaloPlex.

This means it is impossible to remove PCR artefacts from the data. This is why I much prefer hybridisation capture for this kind of study, as deduplication can be performed there and keeps that source of false positives under control.

My best advice, if you're stuck with this system, is to run samples in duplicate - at least then, if you get false positives from the amplification, they shouldn't appear in both replicates.
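A minimal sketch of that replicate-filtering idea (plain Python on hypothetical call sets; a real pipeline would intersect VCFs, e.g. with bcftools isec): require each call to appear in both technical replicates, on the assumption that PCR artefacts arise independently in each library.

```python
# Hypothetical variant calls from two technical replicates of the same
# sample, keyed by (chrom, pos, ref, alt). Coordinates are made up.
replicate_1 = {
    ("chr17", 7578406, "C", "T"),   # real somatic variant
    ("chr13", 32914438, "G", "A"),  # artefact unique to library 1
}
replicate_2 = {
    ("chr17", 7578406, "C", "T"),   # real variant reproduces
    ("chr2", 29443695, "T", "C"),   # artefact unique to library 2
}

# Amplification artefacts should arise independently in each library,
# so keeping only calls seen in both replicates filters most of them out.
confirmed = replicate_1 & replicate_2
print(sorted(confirmed))  # only the reproducible call survives
```

The trade-off is obvious: you pay double the sequencing cost, and real low-allele-fraction variants that drop out of one replicate are lost too.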
Old 05-24-2014, 01:06 PM   #3
nbahlis
Member
 
Location: Canada

Join Date: May 2013
Posts: 25
Default

I have the same problem. Is there a way to run MuTect without removing duplicates (or presumed duplicates)?
Old 07-07-2014, 09:22 AM   #4
IonTom
Member
 
Location: Germany

Join Date: Apr 2014
Posts: 32
Default

You can use the VariantTools Bioconductor package; it gives you the number of unique in-read positions supporting the variant and the reference. Using a GenomicRanges object generated from a VCF, you can make it report specifically for the positions of interest. This is done with the tally step.
Old 07-07-2014, 09:33 AM   #5
IonTom
Member
 
Location: Germany

Join Date: Apr 2014
Posts: 32
Default

Here is the code:

library(gmapR)
library(VariantTools)
library(VariantAnnotation)
library(BiocParallel)

## ncores, referencefile, referencefolder, vcffile and bam_file are
## supplied by the user.
biocParam <- MulticoreParam(workers = ncores)

## Build (or load) the GMAP genome index from the reference FASTA.
fastaFile <- rtracklayer::FastaFile(referencefile)
gmapGenome <- GmapGenome(fastaFile, create = TRUE, directory = referencefolder)

## Read the candidate variants and convert them to a VRanges object.
vcf <- readVcf(vcffile)
called <- as(unlist(vcf), "VRanges")

## Tally only at the candidate positions; read_pos_breaks bins the
## in-read positions supporting variant and reference.
tally.param <- TallyVariantsParam(gmapGenome,
                                  high_base_quality = 0L,
                                  minimum_mapq = 10L,
                                  which = unique(as(called, "GRanges")),
                                  ignore_duplicates = FALSE,
                                  read_pos_breaks = c(1, 10, 120, 330),
                                  variant_strand = 1)

tallied <- tallyVariants(bam_file, tally.param, BPPARAM = biocParam)

## Keep only positions present in both the VCF and the tallies.
matched <- called %in% tallied
matched_tallied <- tallied %in% called

cur_called <- called[matched]
tallied <- unique(tallied[matched_tallied])

## Carry the sample names and VCF metadata over onto the tally results.
sampleNames(tallied) <- sampleNames(cur_called)
elementMetadata(tallied) <- c(elementMetadata(cur_called), elementMetadata(tallied))

print(tallied)

Last edited by IonTom; 07-07-2014 at 09:37 AM.
Old 09-11-2014, 01:50 AM   #6
dakl
Member
 
Location: sweden

Join Date: May 2009
Posts: 15
Default

Quote:
Originally Posted by nbahlis View Post
I have the same problem. Is there a way to run MuTect without removing duplicates (or presumed duplicates)?
Yes - just skip the dedup step. It's an upstream preprocessing step that you can simply leave out of the pipeline.
Old 12-16-2016, 10:05 AM   #7
quantrix
Member
 
Location: Pennsylvania

Join Date: Jan 2011
Posts: 21
Default

Hi Fourie,
I was wondering what you ended up doing in terms of variant calling with the Ion CCP panel data? Did you just end up using the torrent variant caller? If so, how did you do the downstream analysis for filtering the variants etc? Did you use Ion Reporter?
I find it quite surprising that the Ion data seems to be incompatible with ANY of the variant callers like MuTect or VarScan for paired tumor/normal analysis.
Thanks in advance for any reply.
Regards

Tags
duplicate removal, ion torrent, variants






