04-12-2011, 02:43 AM   #1
sklages
Senior Member
Location: Berlin, DE

GALAXY: Huge amount of data?

Hi all,

I have a (probably) very naive question ..

We are running a few machines here as a core facility (HiSeq 2000, GAIIx, 454 Titanium, SOLiD) ... there are a couple of users interested in the capabilities of GALAXY, so I am thinking about a local install of the whole package.

My concern is the data volume. E.g. one HiSeq 2000 lane with a PE whole-exome library is roughly 2 x 20 GB of (unzipped) fastq. I can definitely reduce the amount by zipping the datasets, but this is still a huge amount of data to upload via a browser ...
How does this work in practice if you have more than one lane?
How are the jobs scheduled?

I probably need to read up in more detail before starting to set up my own installation ...

Thanks for any comments ..

cheers,
Sven
04-12-2011, 05:03 AM   #2
maubp
Peter (Biopython etc)
Location: Dundee, Scotland, UK

Regarding the upload problem: Galaxy can be set up to let users upload their files by FTP; however, in your situation as a core facility you can import the files directly into Galaxy from disk. People do this as part of an automated sequencing service pipeline - have a search on the Galaxy mailing list.
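
Just as a sketch: at the moment those options live in universe_wsgi.ini. The option names below are the ones I remember, and the paths are made up, so treat this as an illustration and check it against the sample config shipped with your Galaxy version:

# universe_wsgi.ini - illustrative values, verify against your own install

# FTP upload instead of the browser form: point this at the directory
# your FTP server writes per-user uploads into, and tell users which
# host to connect to.
ftp_upload_dir = /data/galaxy/ftp
ftp_upload_site = ftp.example.org

# Core-facility style import straight from disk into data libraries,
# with no copy over HTTP at all:
library_import_dir = /data/runs/for_galaxy
# Optionally let each user import from their own subdirectory:
user_library_import_dir = /data/users
# Allow admins to paste filesystem paths when adding library datasets
# (Galaxy can link to the files rather than copying them):
allow_library_path_paste = True

With library_import_dir set, an admin can add a whole run folder to a shared data library from the admin interface, and users then just copy those datasets into their histories instead of uploading anything.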

Regarding the general data volume problem: by default Galaxy keeps all the files on disk, and you can set up cron jobs to clean up "deleted" datasets. Some users on the Galaxy mailing list have reported needing to be more aggressive with this on their servers to avoid running out of space.
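
For reference, the Galaxy distribution ships cleanup scripts under scripts/cleanup_datasets/. The crontab sketch below assumes a checkout in /opt/galaxy running as a "galaxy" user, and the wrapper script names may differ between releases, so verify them against your install before using anything like this:

# /etc/cron.d/galaxy-cleanup - illustrative only; check script names/flags
# Nightly cleanup run from the Galaxy root directory.

# Mark histories that no longer belong to anyone as deleted,
# then purge histories and datasets that have been flagged as
# deleted for long enough, which frees the files on disk.
0  2 * * * galaxy cd /opt/galaxy && sh scripts/cleanup_datasets/delete_userless_histories.sh
30 2 * * * galaxy cd /opt/galaxy && sh scripts/cleanup_datasets/purge_histories.sh
0  3 * * * galaxy cd /opt/galaxy && sh scripts/cleanup_datasets/purge_datasets.sh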