Dear developer,
I used 29 genomes to produce a VCF file with the Minigraph-Cactus pipeline, and now I want to do some pan-transcriptome analysis, so I need to build the required files.
There is still enough storage, but the task terminates automatically after running for about a day. How can I resolve this?
Only sample.trans.spliced.gcsa and sample.trans.spliced.gcsa.lcp were not produced; the other files are fine.
Best,
Dong
Code:
vg autoindex --threads 32 --workflow mpmap --workflow rpvg --prefix sample.trans --ref-fasta reference.fa --vcf sample.result.vcf.gz --tx-gff Duroc.111.chr1-18.gtf --tmp-dir /home/test/nvmedata2/02.Pantrans/TMP -M 850G
Error:
warning:[vg::Constructor] Skipping duplicate variant with hash c5757e8eca1e42a9bafd6bf1aed0bacad2826367 at 1:274146418
[IndexRegistry]: Constructing GBWT from spliced VG graph and phased VCF input.
[IndexRegistry]: Merging contig GBWTs.
[IndexRegistry]: Stripping allele paths from spliced VG.
[IndexRegistry]: Constructing haplotype-transcript GBWT and finishing spliced VG.
[IndexRegistry]: Merging contig GBWTs.
[IndexRegistry]: Joining transcript origin table.
[IndexRegistry]: Constructing spliced XG graph from spliced VG graph.
[IndexRegistry]: Constructing distance index for a spliced graph.
[IndexRegistry]: Pruning complex regions of spliced VG to prepare for GCSA indexing with GBWT unfolding.
[IndexRegistry]: Constructing GCSA/LCP indexes.
PathGraphBuilder::write(): Memory use of file 0 of kmer paths (850.002 GB) exceeds memory limit (850 GB)
PathGraphBuilder::write(): Memory use of file 0 of kmer paths (850.045 GB) exceeds memory limit (850 GB)
[IndexRegistry]: Exceeded disk or memory use limit while performing k-mer doubling steps. Rewinding to pruning step with more aggressive pruning to simplify the graph.
[IndexRegistry]: Pruning complex regions of spliced VG to prepare for GCSA indexing with GBWT unfolding.
[IndexRegistry]: Constructing GCSA/LCP indexes.
PathGraphBuilder::write(): Memory use of file 0 of kmer paths (850.017 GB) exceeds memory limit (850 GB)
PathGraphBuilder::write(): Memory use of file 0 of kmer paths (850.06 GB) exceeds memory limit (850 GB)
[IndexRegistry]: Exceeded disk or memory use limit while performing k-mer doubling steps. Rewinding to pruning step with more aggressive pruning to simplify the graph.
[IndexRegistry]: Pruning complex regions of spliced VG to prepare for GCSA indexing with GBWT unfolding.
[IndexRegistry]: Constructing GCSA/LCP indexes.
PathGraphBuilder::write(): Memory use of file 0 of kmer paths (850.039 GB) exceeds memory limit (850 GB)
PathGraphBuilder::write(): Memory use of file 0 of kmer paths (850.082 GB) exceeds memory limit (850 GB)
[IndexRegistry]: Exceeded disk or memory use limit while performing k-mer doubling steps. Rewinding to pruning step with more aggressive pruning to simplify the graph.
[IndexRegistry]: Pruning complex regions of spliced VG to prepare for GCSA indexing with GBWT unfolding.
[IndexRegistry]: Constructing GCSA/LCP indexes.
DiskIO::write(): Write failed
DiskIO::write(): You may have run out of temporary disk space at /home/test/nvmedata2/02.Pantrans/TMP
[IndexRegistry]: Unrecoverable error in GCSA2 indexing.
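The final failure message points at the temporary directory rather than RAM, so before rerunning it may help to confirm how much space (and how many inodes) are actually free on the filesystem backing the `--tmp-dir` path while the job runs. A minimal check, assuming the same path as in the command above:

```shell
# Hypothetical diagnostic, not part of vg itself: substitute the path you
# passed to --tmp-dir (here the one from the command above).
TMP=${TMP:-/home/test/nvmedata2/02.Pantrans/TMP}

# Free space on the filesystem holding the temporary directory.
df -h "$TMP"

# Free inodes: GCSA2 k-mer doubling writes many intermediate files, so a
# filesystem can report free bytes yet still refuse writes when inodes run out.
df -i "$TMP"
```

Running these periodically during the GCSA/LCP step would show whether the intermediate k-mer files are genuinely exhausting the NVMe volume, in which case pointing `--tmp-dir` at a larger filesystem is one possible workaround.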