07-03-2010, 05:03 AM   #1
mixter
Default "R Killed" when working with large BAM files

I'm trying to work with large BAM files produced by Bowtie alignment of Illumina sequencing runs. I can work with them fine using tools outside of R, but R seems to have some limitation or issue with files of this size.

Issue: After reading 4 BAM files of 650 MB (maybe 1 million reads each) with the function readAligned("path", "bamfile.bam", "BAM") from the ShortRead package, R simply terminates with the message:

/usr/lib64/R/bin/BATCH: line 60: 10315 Killed ${R_HOME}/bin/R -f ${in} ${opts} ${R_BATCH_OPTIONS} > ${out} 2>&1
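
For reference, a minimal sketch of the calls involved (the directory and file names below are placeholders, not the real paths):

Code:
# Sketch only -- bamDir and the file names are placeholders.
# Each readAligned() call loads a whole BAM file into an AlignedRead
# object, so all four files end up in memory at the same time.
library(ShortRead)

bamDir   <- "/path/to/bams"
bamFiles <- c("run1.bam", "run2.bam", "run3.bam", "run4.bam")

alns <- lapply(bamFiles, function(f)
    readAligned(bamDir, pattern = f, type = "BAM"))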

I'm running this on a Linux-based grid engine, so it's not the Windows-specific memory limits. It also does not happen with small test BAM files. Might it be about internal limits on table sizes, though?
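
In case it is relevant, this is roughly how the memory footprint of a single loaded file can be checked (again a sketch with placeholder path and file name); the in-memory AlignedRead object can be much larger than the compressed BAM on disk:

Code:
# Sketch -- path and file name are placeholders.
library(ShortRead)

aln <- readAligned("/path/to/bams", pattern = "run1.bam", type = "BAM")
print(object.size(aln), units = "Mb")  # in-memory size of the loaded object
gc()                                   # total memory in use by the R session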

Any help would be greatly appreciated!