SEQanswers > Bioinformatics > "R Killed" when working with large BAM files

mixter 07-03-2010 05:03 AM

"R Killed" when working with large BAM files
I'm trying to work with large BAM files produced by bowtie alignment of illumina sequencing runs. I'm able to work with them fine with tools outside of R, but R seems to have some limitations or issues with the size.

Issue: After reading four BAM files of 650 MB (maybe 1 million reads each) with the function readAligned("path", "bamfile.bam", type="BAM") from the ShortRead package, R simply terminates with the message:

/usr/lib64/R/bin/BATCH: line 60: 10315 Killed ${R_HOME}/bin/R -f ${in} ${opts} ${R_BATCH_OPTIONS} > ${out} 2>&1

I'm doing this on a Linux-based grid engine, so it's not about the Windows-specific memory limits. Also, this does not happen with small BAM files used for testing. Might it be about internal limits on table sizes?

Any help would be greatly appreciated! :)

KevinLam 07-04-2010 07:12 AM

Hi. In the absence of other error messages, is it possible that the job timed out or was killed by administrator-set limits, which are common on shared resources?
Have you tried running just this command in a terminal?
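One quick way to check for such limits is to inspect the resource limits in effect for your session before launching R (the specific limits and their values depend entirely on how your cluster is configured; this only shows what applies to the current shell):

```shell
# Show all per-process resource limits for this shell/session.
# A low "max memory size" or "virtual memory" limit can cause a
# large R job to be killed even though the machine has free RAM.
ulimit -a

# The virtual memory limit by itself (kilobytes, or "unlimited"):
ulimit -v
```

On a grid engine, also compare these against whatever memory you requested for the job (e.g. the h_vmem resource on SGE), since the batch system typically enforces its own limits on top of the shell's.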

ffinkernagel 07-04-2010 11:47 PM

'Killed' is often caused by running out of memory: the Linux out-of-memory (OOM) killer then terminates processes until there is enough free memory again.

How much swap space have you configured in addition to your RAM (and how much RAM is that)?
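If the OOM killer is the culprit, the kernel log usually records it. A quick check (reading the kernel log may require privileges on some systems, hence the error redirect):

```shell
# Look for out-of-memory kills recorded in the kernel log.
dmesg 2>/dev/null | grep -iE "killed process|out of memory" \
    || echo "no OOM events found"

# Current RAM and swap usage, in megabytes.
free -m
```

If `free` shows little or no swap and the four BAM files expand to more than the available RAM once loaded into R objects, the OOM killer is the most likely explanation for the "Killed" message.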

