Old 04-25-2013, 03:04 PM   #41
Harlon
Junior Member
 
Location: Canada

Join Date: Apr 2013
Posts: 6
Default

Quote:
Originally Posted by kameran View Post
Hi Tom,

Studies regarding barcode integrations and degradation rate for run-to-run carryover have not been performed. However, customers have reported steady decreases from wash to wash. Anecdotal evidence suggests that, if carryover was 1 in 1000, it would be more like 1 in 1,000,000 the next time. This assumes there isn't template that has dried and gotten "stuck" somewhere in the system. The more rinsing we can do on the MiSeq, the lower the carryover rate is expected to be.
Hi Kameron

Thanks for the link to the technical bulletin. We had already implemented all of the fixes you suggest, including a maintenance wash between every run, prior to reporting our carry-over problem.

We have noticed that carry-over disappears within 2-3 runs at the read depths we are doing, suggesting that additional washes would help. That said, I am really hoping Illumina will come up with a better solution than 'run 2 maintenance washes between runs'. Aside from the inconvenience, it increases the total time for a 150 bp paired-end run plus wash to >24 hours. Since we're running our MiSeq pretty much every day, this would be a big hit to productivity for us.
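
For what it's worth, here's a back-of-the-envelope sketch (Python) of how residual carryover might fall off with extra washes; the starting rate and per-wash reduction factor are assumptions on my part, not measured values:

Code:
# Back-of-the-envelope model of residual carryover after extra washes.
# Assumes each wash cuts carryover by a constant factor; both the starting
# rate and the per-wash reduction factor below are assumptions, not data.
def residual_carryover(initial_rate, per_wash_factor, n_washes):
    return initial_rate * (per_wash_factor ** n_washes)

reads_per_run = 15_000_000                       # ballpark MiSeq output
for n in range(4):
    rate = residual_carryover(0.002, 0.1, n)     # 0.2% observed, assume 10x drop per wash
    print(f"{n} extra washes: {rate:.5%} carryover (~{rate * reads_per_run:,.0f} reads)")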

Cheers
h
Old 04-26-2013, 04:46 AM   #42
thomasblomquist
Member
 
Location: Ohio

Join Date: Jul 2012
Posts: 68
Default

This thread has been very helpful for our specific purposes. So I'm thinking that if we rotate through three batches of barcoding reagents, there should be a sufficient number of inter-run washes to eliminate tubing contamination of barcoded product from a previous run that used the same barcode, i.e. only every third to fourth run would reuse the same barcodes.

Thanks again. Very helpful.

-Tom
Old 04-26-2013, 04:37 PM   #43
nhunkapiller
Junior Member
 
Location: Bay Area

Join Date: Dec 2012
Posts: 6
Default

Is there a consensus as to whether this issue affects the HiSeq 2500 in Rapid Mode?
Old 05-02-2013, 10:19 AM   #44
james hadfield
Moderator
 
Location: Cambridge, UK

Join Date: Feb 2008
Posts: 221
Default

The answer is in Kameran's post, although why Illumina only put this on iCom rather than making it publicly available is a little galling.

Quote:
Originally Posted by kameran View Post
Hi everyone, this is Kameran and I’m part of the Illumina Technical Support team. Illumina has posted a bulletin, “Best Practices for High Sensitivity Applications: Minimizing Sample Carryover”, which can be found by following this link:
https://icom.illumina.com/MyIllumina...applications-m


In summary, run to run sample carryover is more likely to have an effect on very low detection threshold applications. Our internal testing found that, when the instrument is properly washed and maintained, sample carryover typically remained below 0.1%, representing 1 read in 1000. Carryover may arise from a number of steps in the sequencing workflow, including incomplete removal of template DNA before the next run and various steps of library preparation. Proper maintenance procedures with water and Tween are recommended to minimize carryover. We currently do not advise the use of other wash additives, such as bleach, Triton or other decontamination solutions.

Of course, we’re always available by phone or email ([email protected]) to discuss any additional questions or concerns.
They say "To avoid direct introduction of samples on the HiSeq 1500/2500, the TruSeq® Rapid Duo cBot Sample Loading Kit (CT-402-4001) can be used to load samples onto the flow cell. The Rapid Duo kit contains disposable parts, thereby limiting the opportunity for carryover during clustering."

I think it is clear there is a fail-safe work-around for HiSeq 2500. Now we just need something a little better for MiSeq and we're all happy once more!
Old 05-02-2013, 12:54 PM   #45
id0
Senior Member
 
Location: USA

Join Date: Sep 2012
Posts: 130
Default

I am just trying to understand this issue better. If the contamination is less than 1%, it seems largely irrelevant. In my experience, trying to call variants at that frequency will yield a huge number of false positives. Isn't the sequencing error rate a much bigger problem, which makes this a moot point?
Old 05-02-2013, 05:45 PM   #46
snetmcom
Senior Member
 
Location: USA

Join Date: Oct 2008
Posts: 152
Default

Quote:
Originally Posted by id0 View Post
I am just trying to understand this issue better. If the contamination is less than 1%, it seems largely irrelevant. In my experience, trying to call variants at that frequency will yield a huge number of false positives. Isn't the sequencing error rate a much bigger problem, which makes this a moot point?
1% contamination means on the order of 150,000 contaminant reads in a normal run (roughly 15 million reads total), not a 1% variant frequency.

It's not trivial for certain projects.
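
As a quick sanity check on the numbers (Python; the 15-million-read total is just a ballpark for a MiSeq run, adjust for your instrument):

Code:
# Contaminant reads implied by a given carryover rate and run size.
def contaminant_reads(total_reads, carryover_rate):
    return int(total_reads * carryover_rate)

print(contaminant_reads(15_000_000, 0.01))    # 1% carryover   -> 150,000 reads
print(contaminant_reads(15_000_000, 0.002))   # 0.2% carryover -> 30,000 reads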
Old 05-02-2013, 06:33 PM   #47
thomasblomquist
Member
 
Location: Ohio

Join Date: Jul 2012
Posts: 68
Default

Some types of genetic variation, such as translocations, indels, and repeats, are detectable well beyond the typical single-nucleotide-variant limit of detection in sequencing. Importantly, their relative abundance, even at very low frequencies, can be clinically relevant.
Old 05-02-2013, 10:33 PM   #48
MrGuy
Member
 
Location: earth

Join Date: Mar 2009
Posts: 68
Default

Even getting a single read of something errant is enough to shut. down. everything. So having even the slightest risk of detectable contamination means having proper controls in place so that these contamination events are limited and characterized as a known quantity.
Old 07-16-2013, 03:38 PM   #49
nhunkapiller
Junior Member
 
Location: Bay Area

Join Date: Dec 2012
Posts: 6
Default

Reviving this thread in hopes that other users out there have found a way to solve this problem on the HiSeq2500 without rotating barcodes or clustering on the cBot. We have not, despite operationally switching to 2 water washes for every run.

I have barcode contamination data on literally hundreds of runs on many instruments where we run both sides around the clock. The problem is absolutely due to carryover from the previous run based on the barcode contamination patterns.

Any experimental wash procedures that are working better out there? Unfortunately, we are very sensitive to this problem, even in the 0.2-0.5% carryover range.
Old 09-12-2013, 10:05 AM   #50
thomasblomquist
Member
 
Location: Ohio

Join Date: Jul 2012
Posts: 68
Default

Quote:
Originally Posted by nhunkapiller View Post
Reviving this thread in hopes that other users out there have found a way to solve this problem on the HiSeq2500 without rotating barcodes or clustering on the cBot. We have not, despite operationally switching to 2 water washes for every run.

I have barcode contamination data on literally hundreds of runs on many instruments where we run both sides around the clock. The problem is absolutely due to carryover from the previous run based on the barcode contamination patterns.

Any experimental wash procedures that are working better out there? Unfortunately, we are very sensitive to this problem, even in the 0.2-0.5% carryover range.
Any update on this? We will be ramping up our sequencing efforts and need to have this issue resolved as well. If not, we may jump back to an Ion Torrent approach. Even though it is more expensive, it does not have this very serious flaw.

-Tom
Old 09-13-2013, 03:52 AM   #51
pmiguel
Senior Member
 
Location: Purdue University, West Lafayette, Indiana

Join Date: Aug 2008
Posts: 2,218
Default

The contamination seems like it has to derive from the fluidics that deliver the denatured library to the flow cell. It is standard procedure to change gaskets (and, of course, flow cells) between runs; during this process, wiping down the area where the gasket sits could be added before placing the new gasket.

Also, you could use the "check" (flow check) capability of the instrument software to run extra 1 N NaOH/water flows through the sample input ports. It seems like this could be done pretty quickly and then followed with a few water flows.

--
Phillip
Old 04-15-2014, 05:56 PM   #52
KristenC
Junior Member
 
Location: New Zealand

Join Date: Nov 2013
Posts: 7
Default Targeted RNA assay

Hi,

Again reviving the topic...

We are running a targeted RNA assay on our Miseq, and also have that issue with the carry-over of previously run libraries. We have two sets of about 40 targets and we alternate those two panels each run. We use 4 adapters per run for sample libraries and screen for the other 20 adapters to check for contamination. Adapters used for sample libraries are not used for sample libraries for at least the following two runs. After each run, a post-run wash is performed followed by a stand-by wash.

By screening for the adapters that were not used in the present run, we can see that, despite all our measures, we keep picking up reads that align to our targets, albeit not exceeding Illumina's guidelines. The distribution of those contaminant reads along the targets is similar to what we see in the actual sample libraries (high expressors have more 'contaminant' reads than low expressors).

But it makes me wonder whether a count of, say, 5 is a genuine detection, or whether it may be due to contamination and as such should be assigned 'undetected'. For expressors that are consistently low, there is practically no contamination present in the unused adapters, but for those that vary between 0 and 1000 counts, a count of 5 or even 20 is hard to interpret.

So I wanted to ask if anybody has experience with setting a threshold below which you consider a target to be undetected based on contamination profiles. E.g. would you use a theoretical average contamination per target per run based on the levels in the other-than-sample-library adapters?
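
Concretely, what I have in mind is something like the sketch below (the counts are made up, and the Poisson-style cutoff is just one assumption about how the background behaves, not an established guideline):

Code:
# Per-target detection threshold derived from reads assigned to adapters
# that were NOT used in the current run ("background" adapters).
# Assumes carryover counts are roughly Poisson-distributed around the
# background mean, so anything above mean + 3*sqrt(mean) is called detected.
import math

def detection_threshold(background_counts):
    mean_bg = sum(background_counts) / len(background_counts)
    return mean_bg + 3 * math.sqrt(mean_bg) if mean_bg > 0 else 0.0

unused_adapter_counts = [3, 7, 4, 6]     # hypothetical counts for one target
cutoff = detection_threshold(unused_adapter_counts)
print(f"counts above ~{cutoff:.1f} are called detected; at or below, undetected")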
Old 04-15-2014, 06:13 PM   #53
thomasblomquist
Member
 
Location: Ohio

Join Date: Jul 2012
Posts: 68
Default

Quote:
Originally Posted by KristenC View Post
Hi,

Again reviving the topic...

We are running a targeted RNA assay on our Miseq, and also have that issue with the carry-over of previously run libraries. We have two sets of about 40 targets and we alternate those two panels each run. We use 4 adapters per run for sample libraries and screen for the other 20 adapters to check for contamination. Adapters used for sample libraries are not used for sample libraries for at least the following two runs. After each run, a post-run wash is performed followed by a stand-by wash.

By screening for the adapters that were not used in the present run, we can see that, despite all our measures, we keep picking up reads that align to our targets, albeit not exceeding Illumina's guidelines. The distribution of those contaminant reads along the targets is similar to what we see in the actual sample libraries (high expressors have more 'contaminant' reads than low expressors).

But it makes me wonder whether a count of, say, 5 is a genuine detection, or whether it may be due to contamination and as such should be assigned 'undetected'. For expressors that are consistently low, there is practically no contamination present in the unused adapters, but for those that vary between 0 and 1000 counts, a count of 5 or even 20 is hard to interpret.

So I wanted to ask if anybody has experience with setting a threshold below which you consider a target to be undetected based on contamination profiles. E.g. would you use a theoretical average contamination per target per run based on the levels in the other-than-sample-library adapters?
Hi Kristen,

Some time back you stumbled across our paper on targeted RNA seq: http://seqanswers.com/forums/showthread.php?t=35379

The competitive internal standard molecules compete with the native template and provide a relative measure of the contamination. For example, in PCR-amplified libraries it can be hard to know whether 5, 10, 20, or even 100 or 1000 reads of a template are real. However, if you have a standard competitor present in the library prep (say 100 molecules), no native template present during the prep, and your reads come out at 5 native to 1000 standard for that run (the 5 being contaminant carry-over from the tubing on the Illumina), you can calculate the actual number of molecules:

5/1000 * 100 = 0.5 molecules of native template in the actual reaction, i.e. less than one molecule, which most likely represents contamination.

This has been our approach. The competitor standard gives us a reality check on the biases of NGS library prep as well as on carry-over contamination on the Illumina platforms.
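
For anyone following along, that back-calculation as a small Python sketch (the numbers are just the worked example above, not real data):

Code:
# Back-calculate native molecules from the native/standard read ratio when a
# known number of internal standard molecules was spiked into the prep.
def native_molecules(native_reads, standard_reads, standard_molecules):
    return (native_reads / standard_reads) * standard_molecules

estimate = native_molecules(native_reads=5, standard_reads=1000, standard_molecules=100)
print(f"estimated native molecules in reaction: {estimate}")   # 0.5 -> likely carryover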

Regards,

-Tom Blomquist
Old 04-15-2014, 06:38 PM   #54
bunce
Member
 
Location: Perth

Join Date: Sep 2012
Posts: 55
Default

Hi Kristen,
Working in the field of ancient (and low-copy-number) DNA, often with mixed samples, we are paranoid about this kind of contamination. We primarily do amplicon sequencing, and a few years back, after picking up carry-over on the 454 (and PGM... now MiSeq), we decided not to reuse index combinations. This means we have a hefty primer bill, but contamination is bioinformatically screened out because each library is unique.

We now do our sequencing on the MiSeq, and in addition to the unique indexes we also do the new bleach post-run wash. I have no idea how effective this wash is (there are figures from Illumina, if you believe them!), but it is very easy to implement and should be your first option.

We do a few other things to minimise contamination, such as making up the NaOH in our pre-PCR area. From your description of the contamination (where you don't re-use tags for two runs), it may well be your environment/set-up that is contaminated and not the instrumentation.

To answer your question about threshold cut-offs: it is next to impossible to put a reliable number on this, because background contamination is sporadic by nature. Tom Blomquist's suggestion is a good one if you need to quantify it, but our solution was to side-step the issue through unique index combinations.

hope this helps?
Mike
Old 04-15-2014, 07:13 PM   #55
KristenC
Junior Member
 
Location: New Zealand

Join Date: Nov 2013
Posts: 7
Default

Hi Tom and Mike,

Thank you for your swift reply.
Tom, I really liked your set-up with the internal standard as a competitor when I first read your paper a while ago. But at that time we were already running experiments within a project (I started halfway through the project), so I could no longer implement your strategy. I am now analyzing all the data and need to deal with the contamination as it stands. I will, however, use the results to improve our set-up for the next project, including the recommendation to use an internal standard to overcome these issues.

Mike,
We are going to implement the bleach post-run wash for our new project, but will keep screening for previously used adapters to see if it makes a difference. We did see a big difference when we started doing the standby wash procedure after every run: levels drop off to less than 0.1% after 2 runs, but do remain at that level thereafter, which is why I wanted to try to use that level as a sort of background.

As for the set-up, we have a number of rules in place to avoid contamination in the sample prep, but there's always room for improvement. I will check where our wash solutions are made; maybe we can improve on that.
We once had a primer fail for one of our targets but decided to run the library anyway, and the target had 0 counts, which I thought indicated that we weren't doing too badly contamination-wise ... :-).
I'm afraid our budget wouldn't allow us to order new indexes every time. I'll see if we can get some new ones in, though; 24 may not be enough...


Thanks for your help already!
Old 04-15-2014, 08:27 PM   #56
bunce
Member
 
Location: Perth

Join Date: Sep 2012
Posts: 55
Default

Hi Kristen,
Just one follow-up note about primers: I have had this response about cost a few times, but non-HPLC-purified primers from IDT are very cheap, and by the time you have repeated a few runs due to contamination they have paid for themselves (not to mention the time and headaches spent chasing contamination). Contrary to what NGS companies tell you, there is no need to HPLC-purify primers: if you screen for exact adapters/MIDs, any error at primer synthesis is discarded (and most indexes require at least 3 mutations to morph into another index). In our (sensitive) workflows, re-using indexes is a false economy. Cheers, Mike
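
As an aside to Mike's point about index distances, it is easy to check how far apart a set of index sequences actually is. A minimal sketch (the 6-mer indexes here are just examples; swap in your own list):

Code:
# Minimum pairwise Hamming distance between index sequences: an index needs
# at least this many errors to turn into another index in the set.
from itertools import combinations

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

indexes = ["ATCACG", "CGATGT", "TTAGGC", "TGACCA"]    # example 6-mer indexes
min_dist = min(hamming(a, b) for a, b in combinations(indexes, 2))
print(f"minimum pairwise Hamming distance: {min_dist}")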
Old 08-15-2014, 04:56 AM   #57
james hadfield
Moderator
 
Location: Cambridge, UK

Join Date: Feb 2008
Posts: 221
Default

Hi, I just wrote a new post on my blog, http://core-genomics.blogspot.co.uk/...ng-bleach.html, which describes the bleach protocol and how to get it.
Old 08-21-2014, 08:33 AM   #58
Mercaptan
Junior Member
 
Location: Atlanta

Join Date: Aug 2012
Posts: 4
Default

I did a little experiment comparing read carry-over with our old regular washes (three cycle 0.5% Tween 20 wash) and Illumina's bleach wash.

Bcl2Fastq parsing of the raw data, checking for indices from prior runs, shows that the bleach wash reduced the carry-over from the previous run from 0.018% (~3,000 reads) to 0.00011% (23 reads). This was only a comparison of two runs, but that's a big reduction. We'll be keeping an eye on this in the future.

We're now doing the bleach wash as a replacement for our post-run washes, followed by a regular three-cycle maintenance wash, because we're paranoid about bleach getting into the next run.
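
For anyone who wants to reproduce this kind of check, the gist is simply to tally index sequences in the Undetermined reads against the previous run's sample sheet. A minimal sketch (the file path and the prior-run index set are placeholders; it assumes bcl2fastq-style read headers where the index sequence is the last colon-separated field):

Code:
# Count Undetermined reads whose index matches one used on the PREVIOUS run.
# Assumed header format: @inst:run:fc:lane:tile:x:y read:filtered:control:INDEX
import gzip
from collections import Counter

prior_run_indexes = {"ATCACG", "CGATGT"}     # placeholder: last run's sample sheet indexes
counts, total = Counter(), 0

with gzip.open("Undetermined_S0_L001_R1_001.fastq.gz", "rt") as fq:
    for i, line in enumerate(fq):
        if i % 4 == 0:                       # FASTQ header lines
            total += 1
            index = line.strip().split(":")[-1]
            if index in prior_run_indexes:
                counts[index] += 1

carryover = sum(counts.values())
print(counts)
print(f"{carryover} of {total} undetermined reads ({carryover / max(total, 1):.4%}) match prior-run indexes")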
Old 09-07-2017, 07:00 AM   #59
illuminaGA
Member
 
Location: Atlanta

Join Date: Dec 2012
Posts: 61
Default

Hi Harlon

Could you give me some pointers about how to detect the carryover contamination? Software, pipeline etc. Thank you in advance.


Quote:
Originally Posted by Harlon View Post
Hi All

We appear to be suffering from carry-over contamination in our MiSeq runs - i.e. if we sequence a DNA sample in one MiSeq run, we see some reads from the same sample in the subsequent run.

Measured in terms of reads, we see about 0.2% contamination run-to-run - i.e. if we see 10,000 reads of a given amplicon/barcode in one run, we'll see ~20 reads in the following run, even if that amplicon/barcode pair was absent from the prep.

Important notes about our workflow:
- Barcodes are added by PCR (we are using our own library prep, not Nextera, etc).
- We perform a post-run wash and a maintenance wash after every run.

I am quite certain that this is carry-over within the MiSeq, and that it is actually carry-over, and not simply barcode contamination within the primers. On the first run of an amplicon, it only shows up with its assigned barcode. It is also detected in the subsequent run. I'm quite certain this is also not laboratory contamination.

After speaking with Illumina, I gather this is certainly feasible: they are aware of the issue, although they see less run-to-run contamination than we do. They suggested that we do 2 maintenance washes between runs, which seems like a lot, and that we don't pour bleach into it, which was certainly not my plan.

Has anyone else had similar issues, and more importantly, does anyone know of any solutions?

Thanks!
Harlon