Hello, I am attempting to assemble a massive amount of Illumina 2x150bp paired-end read data (>3TB). I am considering using MEGAHIT, as it is the least resource-intensive assembler I have used that still gives reasonably good results.
What are the typical strategies for assembling a dataset whose size exceeds typical limitations? I am thinking of dividing the reads into smaller pools, but of course that's not ideal. Thanks