  • 3-5% more seq from new Roche software

    We have reprocessed a few older runs using the new "Phase C" (aka "v. 2.3") Roche Data Processing software that was released last week. This involves redoing image analysis and basecalling, so it is CPU-intensive. But we see more favorable results in 454BaseCallerMetrics.txt in every case we have tried: anywhere from 3% to just over 5% more totalBases. (A quick comparison sketch follows below the post.)

    Anyone else tried to reprocess older runs with the new software? If so, did you see improvements?

    --
    Phillip

    PS The major improvement is that the GSAssembler (aka "Newbler") is now "cDNA-aware". EST assemblies work much better!
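
    A minimal sketch for pulling the totalBases numbers out of a pair of 454BaseCallerMetrics.txt files and reporting the gain per run. The exact layout of that file is an assumption here (the script just looks for a "totalBases = <number>"-style line), so adjust the pattern to whatever your files actually contain:

    import re
    import sys

    def total_bases(metrics_path):
        """Return the first totalBases value found in a 454BaseCallerMetrics.txt file.

        Assumes the value sits on a line like 'totalBases = 123456789';
        adjust the regex if your metrics files are laid out differently.
        """
        pattern = re.compile(r"totalBases\s*[=:]\s*([\d,]+)")
        with open(metrics_path) as handle:
            for line in handle:
                match = pattern.search(line)
                if match:
                    return int(match.group(1).replace(",", ""))
        raise ValueError(f"no totalBases line found in {metrics_path}")

    if __name__ == "__main__":
        # usage: python compare_totalbases.py old_metrics.txt reprocessed_metrics.txt
        old_total = total_bases(sys.argv[1])
        new_total = total_bases(sys.argv[2])
        gain = 100.0 * (new_total - old_total) / old_total
        print(f"old: {old_total}  new: {new_total}  gain: {gain:+.1f}%")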

  • #2
    Yeah, we also tried a few older runs this week and decided to reanalyze everything because of the improvements. We have only tried DNA so far.

    I think there are also improvements in the HCDiffs detection, because we seem to pick up more differences than can be explained by the additional sequence from the improved signal processing and mapping.



    • #3
      Originally posted by Tuxido:
      Yeah, we also tried a few older runs this week and decided to reanalyze everything because of the improvements. We have only tried DNA so far.

      I think there are also improvements in the HCDiffs detection, because we seem to pick up more differences than can be explained by the additional sequence from the improved signal processing and mapping.
      But are those real differences, not false positives? I ask because one way to produce more "high quality" sequence is to tune a base caller to be more "confident" without actually being any more accurate.

      I'm probably just being paranoid. More likely the alignment software was improved...

      --
      Phillip



      • #4
        We haven't validated them yet; we still need to check these differences. We looked up a few in the alignment browser and they seemed okay. We've also already seen some really interesting variants being detected: changes from "-" to "-", for example. Not really helpful.



        • #5
          Originally posted by Tuxido:
          We've also already seen some really interesting variants being detected: changes from "-" to "-", for example. Not really helpful.
          Ahh, a zen koan variant. Sadly, those who reach enlightenment through its agency may not be disposed to post here.

          --
          Phillip



          • #6
            I still have to install the new versions of the software, so I cannot say what the cause of the 3-5% gain is.

            But have you checked the standard filter values against the older software pipeline? I know we get +50% to +100% more reads when tweaking the filter settings (especially in 'exotic' experiments such as bisulphite, ChIP-seq, or pull-down (ChIP) experiments), and most of those additional reads do map and are valid.

            Nevertheless, I am quite suspicious of software-based efficiency improvements for standard experiments, especially because I got a 'hint' from a Roche tech support guy to change a filter's default value from 0.05 to 0.10, which did indeed improve the percentage of reads passing the filters by a few percent.

            If it turns out to be a matter of filter settings, I'd recommend simply refiltering your experiments. There is a chapter on it in the manual, and it takes only 10 minutes or so instead of going through the whole pipeline.



            • #7
              Originally posted by joa_ds:
              I still have to install the new versions of the software, so I cannot say what the cause of the 3-5% gain is.

              But have you checked the standard filter values against the older software pipeline? I know we get +50% to +100% more reads when tweaking the filter settings (especially in 'exotic' experiments such as bisulphite, ChIP-seq, or pull-down (ChIP) experiments), and most of those additional reads do map and are valid.

              Nevertheless, I am quite suspicious of software-based efficiency improvements for standard experiments, especially because I got a 'hint' from a Roche tech support guy to change a filter's default value from 0.05 to 0.10, which did indeed improve the percentage of reads passing the filters by a few percent.

              If it turns out to be a matter of filter settings, I'd recommend simply refiltering your experiments. There is a chapter on it in the manual, and it takes only 10 minutes or so instead of going through the whole pipeline.
              Which Roche program and which parameters are you using to do this refiltering?



              • #8
                I am currently using "gsRunProcessor 2.0.01.12".

                For refiltering, I follow the steps provided in the original manual, page 56 and onwards.

                A short summary of those pages:

                gsRunProcessor --template=filterOnly > filterfile.xml
                (then change the values in this XML file to your desired settings)

                And then to refilter:

                runAnalysisFilter --pipe=filterfile.xml <your_D_folder_to_reanalyse>
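
                For anyone wanting to script this over many runs, here is a rough wrapper around the two commands above. It is only a sketch: the command names and flags come from this post, but the run-folder location and the 0.05 -> 0.10 substitution (the filter value mentioned earlier in the thread) are placeholders to adapt to the values you actually want to change:

                import subprocess
                from pathlib import Path

                def refilter_run(d_folder, template="filterfile.xml"):
                    """Regenerate the filter-only template, tweak it, and refilter one run folder.

                    The commands follow the steps above; the 0.05 -> 0.10 text substitution is
                    just an example of editing a threshold in the template. Inspect the XML and
                    change the values you actually care about instead.
                    """
                    # 1. Write the filter-only template to an XML file.
                    with open(template, "w") as out:
                        subprocess.run(["gsRunProcessor", "--template=filterOnly"],
                                       stdout=out, check=True)

                    # 2. Edit the filter values (naive text substitution as a placeholder).
                    xml = Path(template).read_text()
                    Path(template).write_text(xml.replace("0.05", "0.10"))

                    # 3. Refilter the run folder without redoing image analysis/basecalling.
                    subprocess.run(["runAnalysisFilter", f"--pipe={template}", str(d_folder)],
                                   check=True)

                if __name__ == "__main__":
                    # Hypothetical location of the D folders to reanalyse; adjust to your setup.
                    for run in Path("/data/454_runs").glob("D_*"):
                        refilter_run(run)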

