Old 09-22-2021, 11:01 AM   #1
BBDuk Out of Memory

Hello! I am running BBDuk from BBTools to remove reads that map to a mammal host genome (2.3G). I am running the script on just one sample (an R1 and R2, 12G each). Initially I kept getting "java.lang.OutOfMemoryError: Java heap space" or "java.lang.OutOfMemoryError: GC overhead limit exceeded" errors, so I started requesting 100G+ of memory for my slurm jobs. The jobs now run until the time limit I set is reached (3 days) and do not produce any output files, or even temporary files. The parameters for the last job I ran (1 node, 15 cores, 300G of memory) were:

in1=R1.fastq in2=R2.fastq out1=clean.R1.fastq out2=clean.R2.fastq outm1=matched1.fq outm2=matched2.fq ref=CC.genomic.fna stats=stats.txt k=31 prealloc hdist=1 rskip=4 -Xmx255g

The output I received:
Executing jgi.BBDuk
Version 38.87

Initial size set to 2142000000
36.158 seconds.
Memory: max=273804m, total=273804m, free=85629m, used=188175m

slurmstepd: error: *** JOB CANCELLED DUE TO TIME LIMIT ***

I have also tried adding the rskip=4 and qskip=4 flags, to no avail. I ran the same command on just the first 1000 lines of each fastq file and still hit the out-of-memory error.
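Here is my rough back-of-envelope for the size of the reference k-mer table. I am assuming (and I may be wrong about this) that with hdist=1 BBDuk stores every reference k-mer plus all of its hamming-distance-1 mutants:

```shell
# Back-of-envelope only -- assumes hdist=1 stores all 1-mismatch mutants
# of every reference k-mer (3 possible substitutions at each of k positions).
ref_bases=2300000000              # ~2.3G host genome, ~1 k-mer per base
k=31
mutants_per_kmer=$((3 * k))       # 93 mutants per k-mer at hdist=1
entries=$((ref_bases * (1 + mutants_per_kmer)))
echo "approx k-mer table entries: $entries"
```

That comes out to roughly 2.2e11 entries, which at even a few bytes apiece would be terabytes, far beyond the 255g I am giving the JVM. If that estimate is anywhere near right, should I be using qhdist=1 (mutating the query k-mers instead of the reference) rather than hdist=1?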

Any idea what might be going on? Why is this task taking so long and requiring so much memory for one sample? What can I do to diagnose the issue and actually run my code successfully? I have yet to get BBDuk to complete on my data.
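For completeness, this is roughly the submission script I am using (a sketch; filenames match my command above, but the script itself is reconstructed from memory):

```shell
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --cpus-per-task=15
#SBATCH --mem=300G
#SBATCH --time=3-00:00:00

# bbduk.sh is the BBTools wrapper script; -Xmx255g caps the Java heap
bbduk.sh in1=R1.fastq in2=R2.fastq \
    out1=clean.R1.fastq out2=clean.R2.fastq \
    outm1=matched1.fq outm2=matched2.fq \
    ref=CC.genomic.fna stats=stats.txt \
    k=31 prealloc hdist=1 rskip=4 -Xmx255g
```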
connieR is offline