SEQanswers > Bioinformatics > Bioinformatics

Old 07-03-2010, 05:03 AM   #1
Location: Munich, Germany

Join Date: May 2010
Posts: 22
"R Killed" when working with large BAM files

I'm trying to work with large BAM files produced by Bowtie alignment of Illumina sequencing runs. I can work with them fine in tools outside of R, but R seems to have some limitation or issue with their size.

Issue: After reading 4 BAM files of 650 MB each (maybe 1 million reads apiece) with the function readAligned("path", "bamfile.bam", "BAM") from the ShortRead package, R simply terminates with the message:

/usr/lib64/R/bin/BATCH: line 60: 10315 Killed ${R_HOME}/bin/R -f ${in} ${opts} ${R_BATCH_OPTIONS} > ${out} 2>&1

I'm doing this on a Linux-based grid engine, so it's not about the Windows-specific memory limits. Also, this does not happen with small test BAM files. It might be about internal limits on table sizes, though?
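For reference, readAligned() loads every record, including sequences and quality strings, into memory at once. A lower-memory alternative (a sketch only; the file name is a placeholder) is to drop down to Rsamtools and restrict which fields are read:

```r
library(Rsamtools)

## Read only chromosome, position and strand; skipping the
## sequence and quality strings cuts memory use substantially.
param <- ScanBamParam(what = c("rname", "pos", "strand"))
aln <- scanBam("bamfile.bam", param = param)

## Process files one at a time and free memory in between,
## rather than holding all four alignments simultaneously.
rm(aln)
gc()
```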

Any help would be greatly appreciated!
mixter
Old 07-04-2010, 07:12 AM   #2
Senior Member
Location: SEA

Join Date: Nov 2009
Posts: 198

Hi. In the absence of other error messages, is it possible that the job timed out or was killed by admin-set resource limits, which are common on shared systems?
Have you tried running just this command in a terminal?
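A quick way to check for such limits (a sketch; the actual limits depend on how your cluster is configured) is to print the per-process resource limits from inside a job, since grid engines often lower them relative to an interactive login:

```shell
# Show all resource limits (virtual memory, CPU time, open
# files, ...) in effect for the current shell. Run this inside
# a job script to see what your R process inherits.
ulimit -a

# Virtual memory limit alone, in KB ("unlimited" if unset).
ulimit -v
```

If the values inside a job differ from those in an interactive terminal, the scheduler is imposing its own limits.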
KevinLam
Old 07-04-2010, 11:47 PM   #3
Senior Member
Location: Marburg, Germany

Join Date: Oct 2009
Posts: 110

'Killed' is often a sign of running out of memory: the Linux out-of-memory (OOM) killer then terminates processes until enough free memory is available again.

How much swap space have you configured in addition to your RAM (and how much RAM is that)?
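If the OOM killer is responsible, it usually leaves a trace in the kernel log. A minimal check (assuming you can read the kernel ring buffer on the node where the job ran, which may require root):

```shell
# Current RAM and swap totals and usage, human-readable.
free -h

# Look for OOM-killer messages in the kernel log; no output
# means no recent OOM kill was recorded here.
dmesg 2>/dev/null | grep -i -E "out of memory|killed process" || true
```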
ffinkernagel
