04-18-2017, 02:13 PM   #1
Junior Member
Location: california

Join Date: Apr 2017
Posts: 9
Default coassemble massive amount of data (>3TB)

Hello, I am attempting to assemble a massive amount of Illumina 2x150bp paired-end read data (>3TB). I am considering using megahit, as it is the least resource-intensive assembler I have used and it still gives reasonably good results.
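For reference, this is roughly how I would plan to invoke megahit — the file paths, thread count, and memory fraction here are placeholders, not my actual setup:

```shell
# Hypothetical megahit invocation for a very large paired-end dataset.
# --kmin-1pass is megahit's low-memory mode for building the smallest
# k-mer graph, which is recommended for ultra-large libraries.
megahit \
    -1 reads_R1.fastq.gz \
    -2 reads_R2.fastq.gz \
    --kmin-1pass \
    -m 0.9 \
    -t 32 \
    -o megahit_out
```

`-m 0.9` caps megahit at 90% of available RAM; multiple libraries can be given to `-1`/`-2` as comma-separated lists.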

What are the typical strategies for assembling a dataset that is this far beyond typical memory and compute limits? I am thinking of dividing the reads into smaller pools and assembling each separately, but of course that's not ideal. Thanks
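In case it helps clarify what I mean by pooling: I'd split the paired files into equal parts, something like the sketch below (file names are placeholders; this assumes seqkit is installed):

```shell
# Hypothetical split of a paired-end library into 8 pools,
# keeping read pairs together across the two files.
seqkit split2 \
    -1 reads_R1.fastq.gz \
    -2 reads_R2.fastq.gz \
    -p 8 \
    -O pools/
```

Each pool would then be assembled independently, with the per-pool assemblies merged or deduplicated afterwards — which is exactly the step I'm unsure about.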
confurious