I am having problems using SSPACE Basic with my 454 paired-end data and was hoping to get some help here. SSPACE runs fine with my Illumina PE data, but my 454 data has much longer insert sizes (3-5 kb), and I think those pairs could really make a difference.
My problem is that SSPACE reads in all of the 454 pairs, removes quite a lot of them because they contain Ns, and then maps 0 of the remaining ones. The report is below. It was difficult to get the reads into a format that SSPACE accepts, and I suspect the problem lies in the FASTQ files. A few reads are too long (over 1024 bases), and bowtie complains about them. Would that crash the whole run? I know bowtie is not the best choice for long reads, but I thought it would still manage to map at least some of them. Is SSPACE Premium the answer?
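In case it is relevant, one thing I am considering is pre-filtering the pairs before SSPACE sees them, dropping any pair where either mate contains an N or exceeds the 1024-base limit that bowtie complains about. A minimal sketch with Biopython, assuming the mates are in two synchronized FASTQ files (the file names are just placeholders):

from Bio import SeqIO

MAX_LEN = 1024  # bowtie rejects reads longer than this

def keep(rec):
    # keep a mate only if it has no N and fits within bowtie's length limit
    seq = str(rec.seq).upper()
    return len(seq) <= MAX_LEN and "N" not in seq

with open("lib454_1.fastq") as f1, open("lib454_2.fastq") as f2, \
     open("lib454_1.filtered.fastq", "w") as o1, \
     open("lib454_2.filtered.fastq", "w") as o2:
    kept = dropped = 0
    for r1, r2 in zip(SeqIO.parse(f1, "fastq"), SeqIO.parse(f2, "fastq")):
        if keep(r1) and keep(r2):
            # write the pair to the filtered files, keeping both files in sync
            SeqIO.write(r1, o1, "fastq")
            SeqIO.write(r2, o2, "fastq")
            kept += 1
        else:
            dropped += 1
    print("kept", kept, "pairs, dropped", dropped)

I am not sure this addresses the 0-mapped-reads problem, but it should at least rule out the over-length reads as the cause.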
Any/all help would be much appreciated,
Henrik
READING READS Lib454:
------------------------------------------------------------
Total inserted pairs = 1217215
Number of pairs containing N's = 1066178
Remaining pairs = 151037
------------------------------------------------------------
...
LIBRARY Lib454 STATS:
################################################################################
MAPPING READS TO CONTIGS:
------------------------------------------------------------
Number of single reads found on contigs = 0
Number of pairs used for pairing contigs / total pairs = 0 / 0
------------------------------------------------------------
READ PAIRS STATS:
Assembled pairs: 0 (0 sequences)
Satisfied in distance/logic within contigs (i.e. -> <-, distance on target: 3709 +/-927.25): 0
Unsatisfied in distance within contigs (i.e. distance out-of-bounds): 0
Unsatisfied pairing logic within contigs (i.e. illogical pairing ->->, <-<- or <-->): 0
---
Satisfied in distance/logic within a given contig pair (pre-scaffold): 0
Unsatisfied in distance within a given contig pair (i.e. calculated distances out-of-bounds): 0
---
Total satisfied: 0 unsatisfied: 0
Estimated insert size statistics (based on 0 pairs):
Mean insert size = 0
Median insert size = 0
REPEATS:
Number of repeated edges = 0
------------------------------------------------------------
################################################################################