01-29-2010, 10:34 AM   #1
darked89
Member

Location: Barcelona, Spain
Join Date: Jun 2009
Posts: 36

Fastq data compression with pigz

Hi,

So far we have been compressing a large number of giant FASTQ files with gzip. I just came upon a parallel version of gzip called pigz:
http://zlib.net/pigz/

written by Mark Adler, one of the authors of zlib and gzip:
http://en.wikipedia.org/wiki/Mark_Adler

It is about 3.8x faster than gzip on a 4-core machine.
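
For context, this is roughly how I would call it from our (Python 3) pipeline; a minimal sketch, assuming pigz is installed and on PATH, with a made-up file name:

Code:
import subprocess
from pathlib import Path

def pigz_compress(path: str, threads: int = 4) -> Path:
    """Compress a file in place with pigz, using the given thread count."""
    # Like gzip, pigz replaces file.fastq with file.fastq.gz;
    # -p sets the number of compression threads.
    subprocess.run(["pigz", "-p", str(threads), path], check=True)
    return Path(path + ".gz")

# Hypothetical usage:
# pigz_compress("lane1_reads.fastq", threads=4)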

Before falling for it head first, a simple question: has anybody used it for some time without problems?

Best,

Darek
01-29-2010, 12:14 PM   #2
nilshomer
Nils Homer

Location: Boston, MA, USA
Join Date: Nov 2008
Posts: 1,285

Quote:
Originally Posted by darked89
Before falling for it head first, a simple question: has anybody used it for some time without problems?
Note that there is no speed-up on decompression. For archiving, we use pbzip2, which runs multi-threaded for both compression and decompression. The only issue is that some programs (e.g. aligners) support reading directly from gzip- or bzip2-compressed files; in that case it is much faster (inside the aligner code) to decompress gzip files, since the reading is typically not multi-threaded.
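
For illustration, reading a compressed FASTQ directly is just a matter of layering a decompressor over the file stream; a minimal sketch with the Python 3 standard-library gzip module (the file name is hypothetical):

Code:
import gzip

def read_fastq_gz(path):
    """Yield (name, sequence, quality) records from a gzip-compressed FASTQ."""
    # gzip.open decompresses transparently; note that it is single-threaded,
    # which is exactly the point above about readers inside aligners.
    with gzip.open(path, "rt") as fh:
        while True:
            header = fh.readline().rstrip()
            if not header:
                break                      # end of file
            seq = fh.readline().rstrip()
            fh.readline()                  # the '+' separator line
            qual = fh.readline().rstrip()
            yield header[1:], seq, qual

# Hypothetical usage:
# for name, seq, qual in read_fastq_gz("lane1_reads.fastq.gz"):
#     ...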
02-01-2010, 10:19 AM   #3
darked89
Member

Location: Barcelona, Spain
Join Date: Jun 2009
Posts: 36

Quote:
Originally Posted by nilshomer
Note that there is no speed-up on decompression. For archiving, we use pbzip2 [...]
Thank you. It is always good to know that others are using non-standard compressors without running into problems.

Once I get back to my office in a week or so, I will try to benchmark gzip/bzip2 in single- and multi-threaded versions (a rough sketch of the timing harness follows the list below). I was also thinking about adding two other algorithms:

lzop (the very fast one): http://en.wikipedia.org/wiki/Lzop
xz (faster than single-threaded bzip2 and producing smaller files): http://tukaani.org/xz/
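
Roughly what I have in mind for the single-threaded part, using Python's built-in gzip/bz2/lzma bindings (lzop has no standard-library binding, and the multi-threaded tools would be timed through their command-line front-ends; the input file name is a placeholder):

Code:
import bz2
import gzip
import lzma
import time
from pathlib import Path

CODECS = {
    "gzip": gzip.compress,
    "bzip2": bz2.compress,
    "xz": lzma.compress,
}

def benchmark(path):
    """Time single-threaded, in-memory compression of one file per codec."""
    data = Path(path).read_bytes()
    for name, compress in CODECS.items():
        t0 = time.perf_counter()
        size = len(compress(data))
        dt = time.perf_counter() - t0
        print(f"{name:6s} {dt:8.2f} s  ratio {size / len(data):.3f}")

# benchmark("sample.fastq")  # placeholder file name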

It would be good to compile a list of programs able to read the various compressed formats. I am aware that ABySS can read gz, bz2, or xz input, and there should be a few others happy with fastq.gz.

Darek
02-01-2010, 11:47 AM   #4
lh3
Senior Member

Location: Boston
Join Date: Feb 2008
Posts: 693

There are several very good benchmarks of compression algorithms on various inputs; you can google them. Personally, I prefer gzip because:

1) It is open source and has good APIs (as do bzip2/lzop/xz).

2) It is widely available (as is bzip2).

3) It achieves a reasonable compression ratio (as does bzip2).

4) It is very fast on decompression, several times faster than bzip2.

The last point makes gzip much more practical than bzip2, because we usually compress once but decompress many times. A few people have asked me why BAM is gzip'ed rather than bzip2'ed, given that bzip2 achieves a better compression ratio; the last point is the main reason.
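
The asymmetry is easy to verify yourself; a minimal sketch that times streaming decompression with the standard-library modules (the file names are placeholders):

Code:
import bz2
import gzip
import time

def time_decompress(path, opener):
    """Return seconds taken to stream-decompress one file, discarding the data."""
    t0 = time.perf_counter()
    with opener(path, "rb") as fh:
        while fh.read(1 << 20):   # read in 1 MiB chunks
            pass
    return time.perf_counter() - t0

# Placeholder file names:
# print("gzip :", time_decompress("sample.fastq.gz", gzip.open))
# print("bzip2:", time_decompress("sample.fastq.bz2", bz2.open))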
02-02-2010, 05:53 PM   #5
Torst
Senior Member

Location: The University of Melbourne, AUSTRALIA
Join Date: Apr 2008
Posts: 275

Quote:
Originally Posted by darked89
It is about 3.8x faster than gzip on a 4-core machine. [...] Has anybody used it for some time without problems?
Yes, I've been using it for about a year for compression tasks, e.g. compressing Illumina FASTQ files. On our 8-core SMP system it is about 7.6x faster than gzip; gzip definitely seems CPU-bound rather than I/O-bound.
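
If you want to check the scaling on your own box, something along these lines works; a sketch assuming pigz is on PATH (the file name is a placeholder):

Code:
import subprocess
import time

def time_pigz(path, threads):
    """Time pigz compression at a given thread count, discarding the output."""
    t0 = time.perf_counter()
    # -c writes to stdout (discarded here) so the input file is left untouched;
    # -p sets the number of compression threads.
    subprocess.run(["pigz", "-c", "-p", str(threads), path],
                   stdout=subprocess.DEVNULL, check=True)
    return time.perf_counter() - t0

# for n in (1, 2, 4, 8):
#     print(f"{n} threads: {time_pigz('sample.fastq', n):.1f} s")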