SEQanswers

Old 10-27-2010, 07:28 AM   #1
JohnK
Senior Member
 
Location: Los Angeles, China.

Join Date: Feb 2010
Posts: 106
Picard error

Hi,

I'm having trouble understanding some of the params with Picard. Well, not really understanding them; the real problem is that I'm getting this error:

...
INFO 2010-10-26 22:45:32 MarkDuplicates Read 40000000 records. Tracking 2669417 as yet unmatched pairs. 69542 records in RAM. Last sequence index: 2
INFO 2010-10-26 22:45:47 MarkDuplicates Read 41000000 records. Tracking 2725267 as yet unmatched pairs. 47482 records in RAM. Last sequence index: 2
INFO 2010-10-26 22:46:15 MarkDuplicates Read 42000000 records. Tracking 2779116 as yet unmatched pairs. 25059 records in RAM. Last sequence index: 2
[Tue Oct 26 22:46:36 CDT 2010] net.sf.picard.sam.MarkDuplicates done.
Runtime.totalMemory()=778698752
Exception in thread "main" net.sf.picard.PicardException: Exception writing ReadEnds to file.
at net.sf.picard.sam.ReadEndsCodec.encode(ReadEndsCodec.java:74)
at net.sf.picard.sam.ReadEndsCodec.encode(ReadEndsCodec.java:32)
at net.sf.samtools.util.SortingCollection.spillToDisk(SortingCollection.java:185)
at net.sf.samtools.util.SortingCollection.add(SortingCollection.java:140)
at net.sf.picard.sam.MarkDuplicates.buildSortedReadEndLists(MarkDuplicates.java:305)
at net.sf.picard.sam.MarkDuplicates.doWork(MarkDuplicates.java:109)
at net.sf.picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:150)
at net.sf.picard.sam.MarkDuplicates.main(MarkDuplicates.java:93)
Caused by: java.io.IOException: No space left on device
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:260)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
at java.io.DataOutputStream.flush(DataOutputStream.java:106)
at net.sf.picard.sam.ReadEndsCodec.encode(ReadEndsCodec.java:71)
... 7 more
...

Has anyone ever gotten this "Exception writing ReadEnds to file" error?
Old 10-27-2010, 07:46 AM   #2
francois.sabot
Member

Location: France

Join Date: Dec 2009
Posts: 41

Quote:
Caused by: java.io.IOException: No space left on device
No more space on the hard drive?
__________________
Francois Sabot, PhD

Be realistic. Demand the Impossible.
www.wikiposon.org
Old 10-27-2010, 08:41 AM   #3
JohnK
Senior Member

Location: Los Angeles, China.

Join Date: Feb 2010
Posts: 106

I thought it might be that, but is this a simple matter of setting MAX_SEQUENCES_FOR_DISK_READ_ENDS_MAP to a value greater than the number of mapped reads I have in the .bam file?
Old 10-29-2010, 01:03 PM   #4
westerman
Rick Westerman

Location: Purdue University, Indiana, USA

Join Date: Jun 2008
Posts: 939

Quote:
Originally Posted by JohnK View Post
I thought that was what it would be, but is this a simple matter of modifying MAX_SEQUENCES_FOR_DISK_READ_ENDS_MAP to greater than the number of mapped reads i have in the .bam file?
No. Your problem is *writing* to the disk. Not reading from it.
Old 11-01-2010, 09:57 AM   #5
mrawlins
Member

Location: Retirement - Not working with bioinformatics anymore.

Join Date: Apr 2010
Posts: 63

I would check whether your drive is full or write-protected, and whether you have permission to write to the drive/directory in question. It mostly looks like the drive is full, in which case you'd have to delete some files to free up space.
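A rough sketch of those checks, plus one workaround. The paths and file names below are only placeholders, and the MarkDuplicates invocation is just an example; TMP_DIR is the general Picard option for relocating its temporary files, which is where the ReadEnds spill files from the stack trace end up.

Code:
# how full is each filesystem? Picard spills ReadEnds temp files to the temp
# directory, so a small /tmp can fill up even if the output drive has room
df -h
df -h /tmp

# can you actually write to the output directory?
touch /path/to/output_dir/.write_test && rm /path/to/output_dir/.write_test

# if /tmp is the culprit, point Picard's temporary files at a roomier volume
java -Xmx4g -jar MarkDuplicates.jar \
    INPUT=sample.bam OUTPUT=sample.dedup.bam METRICS_FILE=sample.metrics \
    TMP_DIR=/scratch/with/plenty/of/space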
Old 11-02-2010, 10:24 PM   #6
JohnK
Senior Member

Location: Los Angeles, China.

Join Date: Feb 2010
Posts: 106

Thanks, ladies/guys. My admin-dude found the issue for me, and it was more or less what you were both saying. I posted the reason in another thread.
Old 12-22-2010, 11:00 AM   #7
RockChalkJayhawk
Senior Member

Location: Rochester, MN

Join Date: Mar 2009
Posts: 191
Picard Exception

Any ideas why it can't create a ReadEnds file?
Code:
Runtime.totalMemory()=166526976
Exception in thread "main" net.sf.picard.PicardException: Error creating temporary ReadEnds file
	at net.sf.picard.sam.DiskReadEndsMap.getOutputStreamForSequence(DiskReadEndsMap.java:175)
	at net.sf.picard.sam.DiskReadEndsMap.put(DiskReadEndsMap.java:147)
	at net.sf.picard.sam.MarkDuplicates.buildSortedReadEndLists(MarkDuplicates.java:278)
	at net.sf.picard.sam.MarkDuplicates.doWork(MarkDuplicates.java:109)
	at net.sf.picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:160)
	at net.sf.picard.sam.MarkDuplicates.main(MarkDuplicates.java:93)
Caused by: java.io.FileNotFoundException: /media/1B3B8E0F52CC3359/mRNAsnp/DREM.4603601404094682558.tmp/32279.read_ends (Too many open files)
	at java.io.FileOutputStream.open(Native Method)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:209)
	at java.io.FileOutputStream.<init>(FileOutputStream.java:160)
	at net.sf.picard.sam.DiskReadEndsMap.getOutputStreamForSequence(DiskReadEndsMap.java:170)
	... 5 more
It says there are too many open files, but I should have plenty:
Code:
 cat /proc/sys/fs/file-max
197316
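One thing worth noting: /proc/sys/fs/file-max is the system-wide ceiling. The limit a single Java process usually trips over is the per-process one, which is often only 1024 by default. A quick way to check it, from the same shell you launch Picard in:

Code:
# per-process (soft) limit on open file descriptors
ulimit -n
# hard limit, i.e. the most the soft limit can be raised to without root
ulimit -Hn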
Old 12-23-2010, 02:15 AM   #8
francois.sabot
Member

Location: France

Join Date: Dec 2009
Posts: 41

From http://lj4newbies.blogspot.com/2007/...pen-files.html (the first Google result for "Java too many open files"):

Quote:
This is because too many file descriptors're opened by tomcat. File descriptor can be limited in both system level and shell level.

To check maximum number of fd in system type 'cat /proc/sys/fs/file-max'. In my case it is 65536(someone said it should set to 200000). Tomcat error when try to open socket number 272 so I think 65536 is ok for me for now. Anyway if u want to set it add 'fs.file-max = 200000' to /etc/sysctl.conf
So the open file descriptors seem to be too numerous.
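A sketch of the two places the limit can be raised; the numbers are only examples, and the sysctl change needs root:

Code:
# system-wide ceiling (needs root)
echo 'fs.file-max = 200000' | sudo tee -a /etc/sysctl.conf
sudo sysctl -p

# per-shell limit for the session that launches Picard;
# this works without root, but only up to the hard limit shown by `ulimit -Hn`
ulimit -n 4096

# to make a higher per-user limit persistent, add lines like these to
# /etc/security/limits.conf (also needs root):
#   yourusername  soft  nofile  4096
#   yourusername  hard  nofile  8192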
__________________
Francois Sabot, PhD

Be realistic. Demand the Impossible.
www.wikiposon.org
Old 12-23-2010, 06:39 AM   #9
drio
Senior Member

Location: 4117'49"N / 24'42"E

Join Date: Oct 2008
Posts: 323

Your friendly sysadmin can help you here. 200k FDs may not be enough; it depends on your environment. Are you sharing the box with many users?
__________________
-drd
Old 12-23-2010, 07:45 AM   #10
RockChalkJayhawk
Senior Member

Location: Rochester, MN

Join Date: Mar 2009
Posts: 191

It's just me.
Old 03-11-2011, 06:23 AM   #11
ribozyme
Junior Member

Location: Beijing

Join Date: Oct 2008
Posts: 3

Thank you all!

But it seems I don't have permission to change /etc/sysctl.conf. Any other solutions?

thanks.
Old 09-14-2011, 12:20 PM   #12
tatinhawk
Martin

Location: Cambridge

Join Date: Dec 2009
Posts: 11

Hi everyone,

I have the same problem when trying to run SortSam.jar on a 33 GB BAM. I asked my sysadmin to raise the file-max limit to 8,000,000 and... I still get the same error. So there may be something else...

Thanks.
Old 09-14-2011, 12:34 PM   #13
westerman
Rick Westerman

Location: Purdue University, Indiana, USA

Join Date: Jun 2008
Posts: 939

tatinhawk: Given that this thread is around 9 months old and has discussed at least two different types of errors, which "same error" are you referring to?
Old 09-14-2011, 02:58 PM   #14
tatinhawk
Martin

Location: Cambridge

Join Date: Dec 2009
Posts: 11

Quote:
Originally Posted by tatinhawk View Post
Hi everyone,

I have the same problem when trying to run SortSam.jar on a 33 GB BAM. I asked my sysadmin to raise the file-max limit to 8,000,000 and... I still get the same error. So there may be something else...

Thanks.
Sorry, I forgot to mention: the error I get is "Exception in thread "main" net.sf.samtools.util.RuntimeIOException: java.io.FileNotFoundException: /temp/sortingcollection.1760628862445939889.tmp (Too many open files)".
Old 12-14-2012, 06:41 AM   #15
ngcrawford
Member

Location: Boston MA

Join Date: Jun 2009
Posts: 13

On my university cluster the maximum number of open files is set to 1024, and I was getting the same error reported above. Setting the following Picard flag to something less than that limit seems to resolve the problem.

MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=[some number lower than the output of `ulimit -n`]
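A minimal example of how that might look on the command line (file names are placeholders; 1000 assumes `ulimit -n` reports 1024, as on my cluster):

Code:
java -Xmx4g -jar MarkDuplicates.jar \
    INPUT=sample.bam OUTPUT=sample.dedup.bam METRICS_FILE=sample.metrics \
    MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=1000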