
Conversation

Contributor

@Delphine-L Delphine-L commented Aug 29, 2025

FOR CONTRIBUTOR:

  • I have read the Adding workflows guidelines
  • License permits unrestricted use (educational + commercial)
  • Please also take note of the reviewer guidelines below to facilitate a smooth review process.

FOR REVIEWERS:

  • .dockstore.yml: file is present and aligned with creator metadata in workflow. ORCID identifiers are strongly encouraged in creator metadata. The .dockstore.yml file is required to run tests
  • Workflow is sufficiently generic to be used with lab data: it does not hardcode sample names or reference data, and it can be run without reading an accompanying tutorial.
  • In workflow: annotation field contains short description of what the workflow does. Should start with This workflow does/runs/performs … xyz … to generate/analyze/etc …
  • In workflow: workflow inputs and outputs have human-readable names (spaces are fine, no underscores, dashes only where spelling dictates), no abbreviations unless generally understood. Altering input or output labels requires adjusting these labels in the workflow-tests.yml file as well
  • In workflow: name field should be human-readable (spaces are fine, no underscores, dashes only where spelling dictates), no abbreviations unless generally understood
  • Workflow folder: prefer dash (-) over underscore (_), prefer all lowercase. Folder becomes repository in iwc-workflows organization and is included in TRS id
  • Readme explains what the workflow does, what valid inputs are, and what outputs users can expect. If a tutorial or other resources exist, they can be linked. If a similar workflow exists in IWC, the readme should explain the differences and when one might prefer one workflow over the other
  • Changelog contains appropriate entries
  • Large files (> 100 KB) are uploaded to Zenodo and their URLs are used in the test file


Test Results (powered by Planemo)

Test Summary

Test State  Count
Total       1
Passed      0
Error       1
Failure     0
Skipped     0
Errored Tests
  • ❌ Assembly-decontamination-VGP9.ga_0

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: Scaffolded assembly (fasta):

        • step_state: scheduled
      • Step 2: Taxonomic Identifier:

        • step_state: scheduled
      • Step 11: hard-masking:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • sed --sandbox -r -f '/tmp/tmp6wl2lhgi/job_working_directory/000/18/configs/tmp407n921e' '/tmp/tmp6wl2lhgi/files/8/d/9/dataset_8d966592-173f-41b0-8a60-3ee9f0e1801a.dat' > '/tmp/tmp6wl2lhgi/job_working_directory/000/18/outputs/dataset_66407907-dcdf-42a2-8cf6-03c3aff41b76.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              adv_opts {"__current_case__": 0, "adv_opts_selector": "basic"}
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              code "/^>/!y/atcgn/NNNNN/"
              dbkey "?"
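
        The sed program above (`/^>/!y/atcgn/NNNNN/`) transliterates soft-masked (lowercase) bases to N on every line that is not a FASTA header. A minimal Python sketch of the same hard-masking behavior (the function name `hard_mask` is mine, for illustration only):

        ```python
        def hard_mask(fasta_text: str) -> str:
            """Replace lowercase (soft-masked) bases with N, leaving headers intact."""
            table = str.maketrans("atcgn", "NNNNN")
            out_lines = []
            for line in fasta_text.splitlines():
                if line.startswith(">"):
                    out_lines.append(line)               # keep FASTA headers untouched
                else:
                    out_lines.append(line.translate(table))
            return "\n".join(out_lines)

        print(hard_mask(">seq1\nACGTacgtn"))  # → ">seq1\nACGTNNNNN"
        ```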
      • Step 12: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 1, "action_report": {"values": [{"id": 20, "src": "hda"}]}, "input": {"values": [{"id": 17, "src": "hda"}]}, "min_seq_len": "200", "mode_selector": "clean"}
      • Step 13: blast mitochondria DB:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • blastn  -query '/tmp/tmp6wl2lhgi/files/6/6/4/dataset_66407907-dcdf-42a2-8cf6-03c3aff41b76.dat'   -db '"/cvmfs/data.galaxyproject.org/byhand/refseq/mitochondrion/genomic/2022-03-10/mitochondrion"'  -task 'blastn' -evalue '0.001' -out '/tmp/tmp6wl2lhgi/job_working_directory/000/20/outputs/dataset_6b8389d8-e53e-480c-80e2-54970bf3150d.dat' -outfmt '6 qseqid sseqid length qstart qend evalue qlen qcovs qcovhsp'  -num_threads "${GALAXY_SLOTS:-8}"

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              adv_opts {"__current_case__": 0, "adv_opts_selector": "basic"}
              blast_type "blastn"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              db_opts {"__current_case__": 0, "database": ["refseq_mitochondrion"], "db_opts_selector": "db", "histdb": "", "subject": ""}
              dbkey "?"
              evalue_cutoff "0.001"
              output {"__current_case__": 2, "ext_cols": ["qlen"], "ids_cols": null, "misc_cols": ["qcovs", "qcovhsp"], "out_format": "cols", "std_cols": ["qseqid", "sseqid", "length", "qstart", "qend", "evalue"], "tax_cols": null}
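
        The `-outfmt '6 qseqid sseqid length qstart qend evalue qlen qcovs qcovhsp'` flag makes blastn emit plain tab-separated columns in exactly that order. A hedged sketch of reading such output back as records (the column list is taken from the command line above; `read_blast_tab` is a hypothetical helper, not part of parse_mito_blast.py):

        ```python
        import csv
        import io

        # Column order must match the -outfmt string used in the blastn call.
        COLS = ["qseqid", "sseqid", "length", "qstart", "qend",
                "evalue", "qlen", "qcovs", "qcovhsp"]

        def read_blast_tab(text: str):
            """Parse outfmt-6 tabular BLAST output into a list of dicts."""
            reader = csv.reader(io.StringIO(text), delimiter="\t")
            return [dict(zip(COLS, row)) for row in reader]

        rows = read_blast_tab("scaf1\tNC_000000.0\t500\t1\t500\t1e-50\t600\t83\t83\n")
        print(rows[0]["qcovs"])  # query coverage per subject, as a string field
        ```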
      • Step 14: parsing blast output:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • parse_mito_blast.py --blastout '/tmp/tmp6wl2lhgi/files/6/b/8/dataset_6b8389d8-e53e-480c-80e2-54970bf3150d.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
      • Step 15: removing scaffolds:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 0, "discover_paths": false, "homopolymer_compress": null, "output_condition": {"__current_case__": 1, "line_length": null, "out_format": "fasta.gz"}, "remove_terminal_gaps": true, "selector": "manipulation", "sort": "", "swiss_army_knife": null}
              target_condition {"__current_case__": 1, "exclude_bed": {"values": [{"id": 26, "src": "hda"}]}, "include_bed": null, "target_option": "true", "target_sequence": ""}
      • Step 3: Species Binomial Name:

        • step_state: scheduled
      • Step 4: Maximum length of sequence to consider for mitochondrial scaffolds:

        • step_state: scheduled
      • Step 5: toolshed.g2.bx.psu.edu/repos/richard-burhans/ncbi_fcs_adaptor/ncbi_fcs_adaptor/0.5.0+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • /app/fcs/bin/av_screen_x -o "$(pwd)" '--euk' '/tmp/tmp6wl2lhgi/files/5/a/3/dataset_5a34ee9e-a82b-4aad-b2c5-f93ccbd8e4c3.dat'

            Exit Code:

            • 0

            Standard Error:

            • Resolved '/app/fcs/progs/ForeignContaminationScreening.cwl' to 'file:///app/fcs/progs/ForeignContaminationScreening.cwl'
              [workflow ] start
              [workflow ] starting step ValidateInputSequences
              [step ValidateInputSequences] start
              [job ValidateInputSequences] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/5kh3i9ql$ validate_fasta \
                  --jsonl \
                  validate_fasta.log \
                  --fasta-output \
                  validated.fna \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/ae3mix2e/stg2735be8b-d440-4cf7-b4f1-80732477bc80/dataset_5a34ee9e-a82b-4aad-b2c5-f93ccbd8e4c3.dat > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/5kh3i9ql/validate_fasta.txt
              [job ValidateInputSequences] Max memory used: 40MiB
              [job ValidateInputSequences] completed success
              [step ValidateInputSequences] completed success
              [workflow ] starting step parallel_section
              [step parallel_section] start
              [workflow parallel_section] start
              [workflow parallel_section] starting step SplitInputSequences
              [step SplitInputSequences] start
              [workflow SplitInputSequences] start
              [workflow SplitInputSequences] starting step fasta_split
              [step fasta_split] start
              [job fasta_split] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/u_23kjky$ fasta_split \
                  -in_file \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/t26gdf33/stged0b1cf4-4f12-4bbb-8592-7630cf4b5c6d/validated.fna_0.fna \
                  -out_file \
                  split_fasta.fna \
                  -logfile \
                  fast_split.log \
                  -mapping_json \
                  seq_mapping.jsonl
              protobuf arena allocated space: 10000000, used: 83432
              [job fasta_split] completed success
              [step fasta_split] completed success
              [workflow SplitInputSequences] starting step log
              [step log] start
              [job log] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/uwc7l3xf$ cxxlog2pb \
                  --stage \
                  SplitInputSequences < /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/25jceyf_/stg8a358bec-e277-49da-ba56-39b2fa517172/fast_split.log > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/uwc7l3xf/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log] Max memory used: 39MiB
              [job log] completed success
              [step log] completed success
              [workflow SplitInputSequences] completed success
              [step SplitInputSequences] completed success
              [workflow parallel_section] starting step AdaptorScreeningAndFilterResults
              [step AdaptorScreeningAndFilterResults] start
              [workflow AdaptorScreeningAndFilterResults] start
              [workflow AdaptorScreeningAndFilterResults] starting step blast
              [step blast] start
              [job blast] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/us_9278m$ vecscreen \
                  -db \
                  adaptors_for_euks \
                  -logfile \
                  vecscreen.log \
                  -out \
                  vs_unfiltered.hit \
                  -query \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/9bjf0xru/stg5fb6bf8a-4e8e-4645-bd9d-c3f8c347fd1d/split_fasta.fna \
                  -term-flex \
                  25
              [job blast] Max memory used: 56MiB
              [job blast] completed success
              [step blast] completed success
              [workflow AdaptorScreeningAndFilterResults] starting step log_2
              [step log_2] start
              [job log_2] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/i1xxqyxu$ cxxlog2pb \
                  --stage \
                  AdaptorScreening < /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/gxcq7aba/stgff44dbc4-2da1-4538-8bd1-5a2e83f0c6ad/vecscreen.log > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/i1xxqyxu/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log_2] Max memory used: 39MiB
              [job log_2] completed success
              [step log_2] completed success
              [workflow AdaptorScreeningAndFilterResults] starting step filter
              [step filter] start
              [job filter] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/pquicgn2$ vecscreen_filter \
                  --filtered \
                  vs_filtered.jsonl \
                  --unfiltered \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/qe90l2lk/stge1507161-a2af-4bd9-9062-e5d1573ab417/vs_unfiltered.hit
              [job filter] Max memory used: 16MiB
              [job filter] completed success
              [step filter] completed success
              [workflow AdaptorScreeningAndFilterResults] completed success
              [step AdaptorScreeningAndFilterResults] completed success
              [workflow parallel_section] starting step ApplyHeuristicsToMakeExcludeAndTrimCalls
              [step ApplyHeuristicsToMakeExcludeAndTrimCalls] start
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] start
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] starting step make_calls
              [step make_calls] start
              [job make_calls] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/dvj46w_5$ make_calls \
                  -a \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/0f7fkmzr/stg8c86f625-4445-4535-bc04-2b9327ff44cb/vs_filtered.jsonl \
                  -logfile \
                  make_calls.log \
                  -seq-len \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/0f7fkmzr/stg2089eefe-719a-4095-bc05-4ccd203846b9/seq_mapping.jsonl > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/dvj46w_5/combined.calls.jsonl
              [job make_calls] completed success
              [step make_calls] completed success
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] starting step log_3
              [step log_3] start
              [job log_3] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/jfj2uab5$ cxxlog2pb \
                  --stage \
                  ApplyHeuristicsToMakeExcludeAndTrimCalls < /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/4jfwm1yf/stg498ad51f-6b60-413d-94ac-4037b3436995/make_calls.log > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/jfj2uab5/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log_3] Max memory used: 39MiB
              [job log_3] completed success
              [step log_3] completed success
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] completed success
              [step ApplyHeuristicsToMakeExcludeAndTrimCalls] completed success
              [workflow parallel_section] starting step log_merging
              [step log_merging] start
              [job log_merging] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/ebnixilz$ cat \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/l6iups9t/stgaeada80b-6976-4ad2-b4d8-104739d85490/45ec615881c92e938694dd3ccaea7a9c03748313 \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/l6iups9t/stg3dd9cff9-1ff0-4f27-afd6-cdaeb131c1fb/45ec615881c92e938694dd3ccaea7a9c03748313 \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/l6iups9t/stg285479c4-29e0-4dd4-bd02-341f02d2c091/45ec615881c92e938694dd3ccaea7a9c03748313 > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/ebnixilz/par_sec.log
              [job log_merging] completed success
              [step log_merging] completed success
              [workflow parallel_section] completed success
              [step parallel_section] completed success
              [workflow ] starting step seq_mapping
              [step seq_mapping] start
              [job seq_mapping] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/n0w5owy8$ cat \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/c3sefww5/stg817db655-e4e8-483a-ab21-fcfafee1c097/seq_mapping.jsonl > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/n0w5owy8/seq_mapping.jsonl
              [job seq_mapping] completed success
              [step seq_mapping] completed success
              [workflow ] starting step gather_logs
              [step gather_logs] start
              [job gather_logs] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/y85xsjvs$ cat \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/i3h4cbxp/stg3a0fda6c-889f-4fb1-9319-1351ef9169cf/par_sec.log > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/y85xsjvs/par_sec_logs.log
              [job gather_logs] completed success
              [step gather_logs] completed success
              [workflow ] starting step adaptor_calls
              [step adaptor_calls] start
              [job adaptor_calls] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/t_4xsrbk$ cat \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/p14mry7e/stg18780c80-9b8a-475b-8e4f-c27cbafad1aa/combined.calls.jsonl > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/t_4xsrbk/adaptor_calls.jsonl
              [job adaptor_calls] completed success
              [step adaptor_calls] completed success
              [workflow ] starting step post_processor
              [step post_processor] start
              [workflow post_processor] start
              [workflow post_processor] starting step postproc_calls
              [step postproc_calls] start
              [job postproc_calls] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/0o1uqvnr$ postproc_calls \
                  -in_calls \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/fp21y2ml/stg00953b87-8612-498c-bdc3-5e6702519d74/adaptor_calls.jsonl \
                  -logfile \
                  postproc_calls.log \
                  -input_mapping \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/fp21y2ml/stg1cbc01c0-60c7-42fc-a3ea-af2acc797238/seq_mapping.jsonl \
                  -out_file \
                  combined.calls.jsonl
              [job postproc_calls] completed success
              [step postproc_calls] completed success
              [workflow post_processor] starting step log_4
              [step log_4] start
              [job log_4] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/_xp0bvij$ cxxlog2pb \
                  --stage \
                  PostProcessCalls < /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/wcjquqkp/stgaba1a966-2d8d-44b0-850b-a67f8d6d8d95/postproc_calls.log > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/_xp0bvij/3bc8758dc9026d571fb8c6b8383da3db38612251
              [job log_4] Max memory used: 39MiB
              [job log_4] completed success
              [step log_4] completed success
              [workflow post_processor] completed success
              [step post_processor] completed success
              [workflow ] starting step GenerateCleanedFasta
              [step GenerateCleanedFasta] start
              [workflow GenerateCleanedFasta] start
              [workflow GenerateCleanedFasta] starting step prepare_xml_step
              [step prepare_xml_step] start
              [job prepare_xml_step] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/psbxj_ze$ pbcalls2seqtransform \
                  --skipped \
                  skipped_trims.jsonl < /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/l52u1l2p/stgbd616233-afc6-4fb6-9cb0-49f21de18b2e/combined.calls.jsonl > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/psbxj_ze/fcs_calls.xml
              [job prepare_xml_step] Max memory used: 41MiB
              [job prepare_xml_step] completed success
              [step prepare_xml_step] completed success
              [workflow GenerateCleanedFasta] starting step seqtransform_step
              [step seqtransform_step] start
              [job seqtransform_step] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/l4mz4kz6$ seqtransform \
                  -out \
                  validated.fna_0.cleaned_fa \
                  -in \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/2x0da4kr/stg60fac05d-6907-4dd3-b365-c7f2140dfc38/validated.fna_0.fna \
                  -seqaction-xml-file \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/2x0da4kr/stg2fcacd49-4f43-4b8f-8773-6358946a94b1/fcs_calls.xml \
                  -report \
                  seqtransform.log
              [job seqtransform_step] completed success
              [step seqtransform_step] completed success
              [workflow GenerateCleanedFasta] completed success
              [step GenerateCleanedFasta] completed success
              [workflow ] starting step all_cleaned_fasta
              [step all_cleaned_fasta] start
              [step all_cleaned_fasta] completed success
              [workflow ] starting step collect_logs
              [step collect_logs] start
              [job collect_logs] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/chg7uli2$ cat \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/vj_ero7q/stg53003d6e-6d87-43e5-8dc8-98234c690569/validate_fasta.log \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/vj_ero7q/stge41e5352-f1b0-4085-906f-36f017ccef95/par_sec_logs.log \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/vj_ero7q/stg0cbd2dca-4b4d-48a9-aeb7-49bd0e9defc4/3bc8758dc9026d571fb8c6b8383da3db38612251 > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/chg7uli2/logs.jsonl
              [job collect_logs] completed success
              [step collect_logs] completed success
              [workflow ] starting step all_skipped_trims
              [step all_skipped_trims] start
              [job all_skipped_trims] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/b45_sg1i$ cat \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/i1xyxdv7/stg07d4e9b0-3236-4e78-80d8-bbc66dc39f11/skipped_trims.jsonl > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/b45_sg1i/skipped_trims.jsonl
              [job all_skipped_trims] completed success
              [step all_skipped_trims] completed success
              [workflow ] starting step GenerateReport
              [step GenerateReport] start
              [workflow GenerateReport] start
              [workflow GenerateReport] starting step calls_step
              [step calls_step] start
              [job calls_step] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/yw3qy2pa$ pbcalls2tsv < /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/j02n42zw/stga6336dac-94ce-404a-9f0a-b0efcf00f6e0/combined.calls.jsonl > /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/yw3qy2pa/fcs_adaptor_report.txt
              [job calls_step] Max memory used: 17MiB
              [job calls_step] completed success
              [step calls_step] completed success
              [workflow GenerateReport] starting step log_step
              [step log_step] start
              [job log_step] /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/3hmokc_1$ log_jl2tsv \
                  --infile \
                  /tmp/tmp6wl2lhgi/job_working_directory/000/2/tmp/4kq9mnck/stg49b054b0-4421-4ec5-ab8e-7db74f217d52/logs.jsonl \
                  --outfile \
                  fcs.log
              [job log_step] Max memory used: 16MiB
              [job log_step] completed success
              [step log_step] completed success
              [workflow GenerateReport] completed success
              [step GenerateReport] completed success
              [workflow ] completed success
              

            Standard Output:

            • Output will be placed in: /tmp/tmp6wl2lhgi/job_working_directory/000/2/working
              Executing the workflow
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              advanced {"optional_log": null}
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              tax "--euk"
      • Step 6: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Adaptor Action report:

            • step_state: scheduled
          • Step 2: wc_gnu:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • echo "#lines" > /tmp/tmp6wl2lhgi/job_working_directory/000/3/outputs/dataset_51eaa403-56fa-412a-85a9-a62d054d5b0b.dat &&  cat '/tmp/tmp6wl2lhgi/files/b/e/6/dataset_be60d821-375e-43c0-8965-4e1267791f83.dat' | wc -l | awk '{ print $1 }' >> /tmp/tmp6wl2lhgi/job_working_directory/000/3/outputs/dataset_51eaa403-56fa-412a-85a9-a62d054d5b0b.dat

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  include_header true
                  options ["lines"]
          • Step 11: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.5+galaxy0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cat '/tmp/tmp6wl2lhgi/files/5/6/a/dataset_56a54b61-9b51-44a2-bad3-197b4e022ce4.dat' >> '/tmp/tmp6wl2lhgi/job_working_directory/000/12/outputs/dataset_023b2ff8-dbe2-4b07-81b8-e6cbbf93c135.dat' && cat '/tmp/tmp6wl2lhgi/files/0/f/d/dataset_0fd48860-6ab2-49e5-8435-2c823fdb04c7.dat' >> '/tmp/tmp6wl2lhgi/job_working_directory/000/12/outputs/dataset_023b2ff8-dbe2-4b07-81b8-e6cbbf93c135.dat' && cat '/tmp/tmp6wl2lhgi/files/8/d/c/dataset_8dc66296-bef5-4a88-9de3-f06e1bda19db.dat' >> '/tmp/tmp6wl2lhgi/job_working_directory/000/12/outputs/dataset_023b2ff8-dbe2-4b07-81b8-e6cbbf93c135.dat' && exit 0

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 9, "src": "hda"}]}}, {"__index__": 1, "inputs2": {"values": [{"id": 10, "src": "hda"}]}}]
          • Step 12: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 13, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 2, "src": "hda"}]}}]}}
          • Step 3: Show tail1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • set -eo pipefail; ( cat '/tmp/tmp6wl2lhgi/files/5/1/e/dataset_51eaa403-56fa-412a-85a9-a62d054d5b0b.dat' | tail -n 1 ) > '/tmp/tmp6wl2lhgi/job_working_directory/000/4/outputs/dataset_e85b6fc7-2d13-4ede-aef7-8b4476e3afa0.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  header false
                  lineNum "1"
          • Step 4: param_value_from_file:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  param_type "integer"
                  remove_newlines true
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/map_param_value/map_param_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  input_param_type {"__current_case__": 1, "input_param": "12", "mappings": [{"__index__": 0, "from": "1", "to": "False"}], "type": "integer"}
                  output_param_type "boolean"
                  unmapped {"__current_case__": 2, "default_value": "True", "on_unmapped": "default"}
          • Step 6: Select middle adaptors:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmp6wl2lhgi/galaxy-dev/tools/stats/filtering.py' '/tmp/tmp6wl2lhgi/files/b/e/6/dataset_be60d821-375e-43c0-8965-4e1267791f83.dat' '/tmp/tmp6wl2lhgi/job_working_directory/000/7/outputs/dataset_91d80cb1-781f-4ad3-987e-7114944c0c35.dat' '/tmp/tmp6wl2lhgi/job_working_directory/000/7/configs/tmpkpzhmpx7' 5 "str,int,str,str,str" 1

                Exit Code:

                • 0

                Standard Output:

                • Filtering with int(c4.split('..')[0])>100 and int(c4.split('..')[1])<int(c2)-100, 
                  kept 50.00% of 12 valid lines (12 total lines).
                  Skipped 1 invalid line(s) starting at line #12: "seq_00018	522	ACTION_EXCLUDE		CONTAMINATION_SOURCE_TYPE_ADAPTOR:NGB00360.1:Illumina PCR Primer"
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "int(c4.split('..')[0])>100 and int(c4.split('..')[1])<int(c2)-100"
                  dbkey "?"
                  header_lines "1"
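
                The filter condition keeps adaptor hits that lie more than 100 bp from both sequence ends: column 2 (c2) is the sequence length and column 4 (c4) is the hit range written as "start..end". A small Python sketch of the same logic (the function `is_middle` and the sample row are illustrative, not from the workflow):

                ```python
                def is_middle(row):
                    """True if the adaptor range (col 4) is >100 bp from both ends of the sequence."""
                    seq_len = int(row[1])            # c2: sequence length
                    start, end = row[3].split("..")  # c4: adaptor hit range "start..end"
                    return int(start) > 100 and int(end) < seq_len - 100

                row = ["seq_00001", "5000", "ACTION_TRIM", "250..300", "adaptor-hit"]
                print(is_middle(row))  # → True
                ```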
          • Step 7: Select end adaptors:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmp6wl2lhgi/galaxy-dev/tools/stats/filtering.py' '/tmp/tmp6wl2lhgi/files/b/e/6/dataset_be60d821-375e-43c0-8965-4e1267791f83.dat' '/tmp/tmp6wl2lhgi/job_working_directory/000/8/outputs/dataset_0fd48860-6ab2-49e5-8435-2c823fdb04c7.dat' '/tmp/tmp6wl2lhgi/job_working_directory/000/8/configs/tmp1vyex67c' 5 "str,int,str,str,str" 0

                Exit Code:

                • 0

                Standard Output:

                • Filtering with int(c4.split('..')[0])<=100 or int(c4.split('..')[1])>=int(c2)-100, 
                  kept 45.45% of 11 valid lines (12 total lines).
                  Skipped 1 invalid line(s) starting at line #12: "seq_00018	522	ACTION_EXCLUDE		CONTAMINATION_SOURCE_TYPE_ADAPTOR:NGB00360.1:Illumina PCR Primer"
                  Skipped 1 comment (starting with #) or blank line(s)
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "int(c4.split('..')[0])<=100 or int(c4.split('..')[1])>=int(c2)-100"
                  dbkey "?"
                  header_lines "0"
          • Step 8: Select sequences to exclude:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmp6wl2lhgi/galaxy-dev/tools/stats/filtering.py' '/tmp/tmp6wl2lhgi/files/b/e/6/dataset_be60d821-375e-43c0-8965-4e1267791f83.dat' '/tmp/tmp6wl2lhgi/job_working_directory/000/9/outputs/dataset_8dc66296-bef5-4a88-9de3-f06e1bda19db.dat' '/tmp/tmp6wl2lhgi/job_working_directory/000/9/configs/tmpfjlj2tj9' 5 "str,int,str,str,str" 0

                Exit Code:

                • 0

                Standard Output:

                • Filtering with c4=="", 
                  kept 9.09% of 11 valid lines (12 total lines).
                  Skipped 1 comment (starting with #) or blank line(s)
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "c4==\"\""
                  dbkey "?"
                  header_lines "0"
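The three Filter conditions logged in Steps 6-8 split the FCS-adaptor action report into internal hits (to mask), terminal hits (to trim), and whole-sequence hits (to exclude). A minimal sketch of that classification in plain Python, using the logged `cond` expressions; the column layout (c2 = sequence length, c4 = "start..end" range) and the 100 bp end margin come from the log, the sample rows are illustrative:

```python
# Classify action-report rows the way Steps 6-8 do.
# Columns (1-based, as in the Galaxy Filter tool): c1=seq id, c2=seq length,
# c3=action, c4=range "start..end", c5=contamination source.
END_MARGIN = 100  # from the logged filter conditions

def classify(row):
    c2, c4 = int(row[1]), row[3]
    if c4 == "":
        return "exclude"  # no range: the whole sequence is flagged
    start, end = (int(x) for x in c4.split(".."))
    if start > END_MARGIN and end < c2 - END_MARGIN:
        return "middle"   # internal hit: mask instead of trimming
    return "end"          # hit near a sequence end: safe to trim

rows = [
    ("seq_00001", "5000", "ACTION_TRIM", "2400..2450", "adaptor"),
    ("seq_00002", "5000", "ACTION_TRIM", "10..60", "adaptor"),
    ("seq_00003", "522", "ACTION_EXCLUDE", "", "adaptor"),
]
print([classify(r) for r in rows])  # ['middle', 'end', 'exclude']
```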
          • Step 9: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 8, "src": "hda"}]}}]}}
          • Step 10: Replace trimming by masking:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • awk -v OFS="\t" -v FS="\t" --re-interval --sandbox '{ $3 = gensub( /ACTION_TRIM/, "FIX", "g", $3 ) ; print $0 ; }' '/tmp/tmp6wl2lhgi/files/9/1/d/dataset_91d80cb1-781f-4ad3-987e-7114944c0c35.dat' > '/tmp/tmp6wl2lhgi/job_working_directory/000/11/outputs/dataset_56a54b61-9b51-44a2-bad3-197b4e022ce4.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "3e8a1d8d851611f0ac69000d3a339fc8"
                  chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "column": "3", "find_pattern": "ACTION_TRIM", "replace_pattern": "FIX"}]
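The awk one-liner in Step 10 rewrites `ACTION_TRIM` calls to `FIX` in column 3, so the downstream clean step masks internal adaptor hits instead of trimming them. A rough Python equivalent of that column replacement (tab-separated input assumed):

```python
# Equivalent of: awk '{ $3 = gensub(/ACTION_TRIM/, "FIX", "g", $3); print }'
# applied to column 3 of a tab-separated action report.
def replace_trim_with_fix(line):
    fields = line.rstrip("\n").split("\t")
    if len(fields) >= 3:
        fields[2] = fields[2].replace("ACTION_TRIM", "FIX")
    return "\t".join(fields)

print(replace_trim_with_fix("seq_00001\t5000\tACTION_TRIM\t2400..2450\tadaptor"))
```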
      • Step 7: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • GX_NUM_CORES=${GALAXY_SLOTS:-2} gx clean-genome --input '/tmp/tmp6wl2lhgi/files/5/a/3/dataset_5a34ee9e-a82b-4aad-b2c5-f93ccbd8e4c3.dat' --action-report '/tmp/tmp6wl2lhgi/files/0/2/3/dataset_023b2ff8-dbe2-4b07-81b8-e6cbbf93c135.dat' --contam-fasta-out 'contam.fa' --min-seq-len '200' --output 'clean.fa'

            Exit Code:

            • 0

            Standard Error:

            • Applied 11 actions; 6 seqs dropped; 812 bps dropped; 0 bps lowercased; 290 bps hardmasked.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 1, "action_report": {"values": [{"id": 14, "src": "hda"}]}, "input": {"values": [{"id": 1, "src": "hda"}]}, "min_seq_len": "200", "mode_selector": "clean"}
      • Step 8: soft-masking:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • dustmasker -in '/tmp/tmp6wl2lhgi/files/5/7/e/dataset_57ea4348-b941-4729-909b-457c2caa20e1.dat' -infmt fasta -out '/tmp/tmp6wl2lhgi/job_working_directory/000/15/outputs/dataset_e68a4183-ae3c-4ac3-b36b-d97c5989619b.dat' -window 64 -level 40 -linker 1 -outfmt fasta

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              db_opts {"__current_case__": 2, "database": "", "db_opts_selector": "file", "histdb": "", "subject": {"values": [{"id": 16, "src": "hda"}]}}
              dbkey "?"
              level "40"
              linker "1"
              outformat "fasta"
              window "64"
      • Step 9: toolshed.g2.bx.psu.edu/repos/devteam/fasta_filter_by_length/fasta_filter_by_length/1.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/fasta_filter_by_length/8cacfcf96a52/fasta_filter_by_length/fasta_filter_by_length.py' '/tmp/tmp6wl2lhgi/files/e/6/8/dataset_e68a4183-ae3c-4ac3-b36b-d97c5989619b.dat' 0 0 '/tmp/tmp6wl2lhgi/job_working_directory/000/16/outputs/dataset_8d966592-173f-41b0-8a60-3ee9f0e1801a.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              max_length "0"
              min_length "0"
      • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is error

            Command Line:

            • if [ ! -e "${TMPDIR}/sync-files-completed.txt" ]; then mkdir -p "${TMPDIR}" && sync_files.py get --mft 'https://ncbi-fcs-gx.s3.amazonaws.com/gxdb/latest/all.manifest' --dir "${TMPDIR}" > /dev/null 2>&1 && touch "${TMPDIR}/sync-files-completed.txt"; fi && GX_NUM_CORES=${GALAXY_SLOTS:-2} run_gx.py --phone-home-label 'usegalaxy.org' --fasta '/tmp/tmp6wl2lhgi/files/e/6/8/dataset_e68a4183-ae3c-4ac3-b36b-d97c5989619b.dat' --tax-id '9606' --species 'Homo Sapiens' --split-fasta 'false' --gx-db "${TMPDIR}" --out-basename output --action-report true --generate-logfile false

            Exit Code:

            • 1

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "3e8a1d8c851611f0ac69000d3a339fc8"
              chromInfo "/tmp/tmp6wl2lhgi/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 0, "config_tag": "all", "fasta": {"values": [{"id": 17, "src": "hda"}]}, "id": {"__current_case__": 1, "id_selector": "ncbi_tax", "tax_id": "9606"}, "mode_selector": "screen", "screen_adv": {"div": "", "gx_align_exclude_taxa": "", "gx_extra_contam_divs": null, "ignore_same_kingdom": false, "split_fasta": false}, "species": "Homo Sapiens"}
    • Other invocation details
      • history_id

        • 8b8f1c7eebd5694b
      • history_state

        • error
      • invocation_id

        • 8b8f1c7eebd5694b
      • invocation_state

        • scheduled
      • workflow_id

        • aa18d30e5b2397fd


Test Results (powered by Planemo)

Test Summary

Test State   Count
Total        1
Passed       0
Error        1
Failure      0
Skipped      0
Errored Tests
  • ❌ Assembly-decontamination-VGP9.ga_0

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: Scaffolded assembly (fasta):

        • step_state: scheduled
      • Step 2: Taxonomic Identifier:

        • step_state: scheduled
      • Step 11: hard-masking:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • sed --sandbox -r -f '/tmp/tmpaki4qwg0/job_working_directory/000/19/configs/tmpfq1cjrm3' '/tmp/tmpaki4qwg0/files/b/c/5/dataset_bc5429e4-340f-4fda-8348-e4b523f40087.dat' > '/tmp/tmpaki4qwg0/job_working_directory/000/19/outputs/dataset_b6ba9e9b-ea9e-4356-aef3-b7338854de77.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              adv_opts {"__current_case__": 0, "adv_opts_selector": "basic"}
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              code "/^>/!y/atcgn/NNNNN/"
              dbkey "?"
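The sed program logged here (`/^>/!y/atcgn/NNNNN/`) hard-masks the assembly by transliterating lowercase (soft-masked) bases to N on every non-header line. A rough Python equivalent, for illustration only:

```python
# Hard-mask a soft-masked FASTA: on lines that are not headers,
# turn the lowercase bases a/t/c/g/n into N (sed: /^>/!y/atcgn/NNNNN/).
HARDMASK = str.maketrans("atcgn", "NNNNN")

def hardmask(fasta_text):
    return "\n".join(
        line if line.startswith(">") else line.translate(HARDMASK)
        for line in fasta_text.splitlines()
    )

print(hardmask(">seq1\nACGTacgtn"))  # uppercase bases are left untouched
```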
      • Step 12: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 1, "action_report": {"values": [{"id": 21, "src": "hda"}]}, "input": {"values": [{"id": 18, "src": "hda"}]}, "min_seq_len": "200", "mode_selector": "clean"}
      • Step 13: blast mitochondria DB:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • blastn  -query '/tmp/tmpaki4qwg0/files/b/6/b/dataset_b6ba9e9b-ea9e-4356-aef3-b7338854de77.dat'   -db '"/cvmfs/data.galaxyproject.org/byhand/refseq/mitochondrion/genomic/2022-03-10/mitochondrion"'  -task 'blastn' -evalue '0.001' -out '/tmp/tmpaki4qwg0/job_working_directory/000/21/outputs/dataset_7d1d8432-ea6f-4d56-8de3-d7a747e91277.dat' -outfmt '6 qseqid sseqid length qstart qend evalue qlen qcovs qcovhsp'  -num_threads "${GALAXY_SLOTS:-8}"

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              adv_opts {"__current_case__": 0, "adv_opts_selector": "basic"}
              blast_type "blastn"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              db_opts {"__current_case__": 0, "database": ["refseq_mitochondrion"], "db_opts_selector": "db", "histdb": "", "subject": ""}
              dbkey "?"
              evalue_cutoff "0.001"
              output {"__current_case__": 2, "ext_cols": ["qlen"], "ids_cols": null, "misc_cols": ["qcovs", "qcovhsp"], "out_format": "cols", "std_cols": ["qseqid", "sseqid", "length", "qstart", "qend", "evalue"], "tax_cols": null}
      • Step 14: parsing blast output:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • parse_mito_blast.py --blastout '/tmp/tmpaki4qwg0/files/7/d/1/dataset_7d1d8432-ea6f-4d56-8de3-d7a747e91277.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
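Step 13 writes BLAST hits against the RefSeq mitochondrion database in the tabular format `6 qseqid sseqid length qstart qend evalue qlen qcovs qcovhsp`, and Step 14's `parse_mito_blast.py` reduces them to a list of mitochondrial scaffolds. The script's internals are not shown in the log; purely as an illustration, a minimal reader of that column layout might keep queries with high overall coverage (the 95% cutoff below is an assumption for the example, not necessarily what the script uses):

```python
# Illustrative reader for the logged BLAST tabular layout
# '6 qseqid sseqid length qstart qend evalue qlen qcovs qcovhsp'.
COLS = ["qseqid", "sseqid", "length", "qstart", "qend",
        "evalue", "qlen", "qcovs", "qcovhsp"]

def mito_candidates(blast_lines, min_qcovs=95.0):
    """Return query ids whose overall coverage (qcovs) meets the cutoff."""
    hits = set()
    for line in blast_lines:
        rec = dict(zip(COLS, line.rstrip("\n").split("\t")))
        if float(rec["qcovs"]) >= min_qcovs:
            hits.add(rec["qseqid"])
    return sorted(hits)

print(mito_candidates(
    ["scaf_1\tNC_012920.1\t16000\t1\t16000\t0.0\t16100\t99\t99"]
))  # ['scaf_1']
```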
      • Step 15: removing scaffolds:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 0, "discover_paths": false, "homopolymer_compress": null, "output_condition": {"__current_case__": 1, "line_length": null, "out_format": "fasta.gz"}, "remove_terminal_gaps": true, "selector": "manipulation", "sort": "", "swiss_army_knife": null}
              target_condition {"__current_case__": 1, "exclude_bed": {"values": [{"id": 27, "src": "hda"}]}, "include_bed": null, "target_option": "true", "target_sequence": ""}
      • Step 3: Species Binomial Name:

        • step_state: scheduled
      • Step 4: Maximum length of sequence to consider for mitochondrial scaffolds:

        • step_state: scheduled
      • Step 5: toolshed.g2.bx.psu.edu/repos/richard-burhans/ncbi_fcs_adaptor/ncbi_fcs_adaptor/0.5.0+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • /app/fcs/bin/av_screen_x -o "$(pwd)" '--euk' '/tmp/tmpaki4qwg0/files/c/9/b/dataset_c9b0f52c-388b-4cc7-bd32-5dd21fee60cd.dat'

            Exit Code:

            • 0

            Standard Error:

            • Resolved '/app/fcs/progs/ForeignContaminationScreening.cwl' to 'file:///app/fcs/progs/ForeignContaminationScreening.cwl'
              [workflow ] start
              [workflow ] starting step ValidateInputSequences
              [step ValidateInputSequences] start
              [job ValidateInputSequences] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/ojl03c6c$ validate_fasta \
                  --jsonl \
                  validate_fasta.log \
                  --fasta-output \
                  validated.fna \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/p58qmt87/stg9cb138a0-2a32-44df-8111-68afc000ead7/dataset_c9b0f52c-388b-4cc7-bd32-5dd21fee60cd.dat > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/ojl03c6c/validate_fasta.txt
              [job ValidateInputSequences] Max memory used: 39MiB
              [job ValidateInputSequences] completed success
              [step ValidateInputSequences] completed success
              [workflow ] starting step parallel_section
              [step parallel_section] start
              [workflow parallel_section] start
              [workflow parallel_section] starting step SplitInputSequences
              [step SplitInputSequences] start
              [workflow SplitInputSequences] start
              [workflow SplitInputSequences] starting step fasta_split
              [step fasta_split] start
              [job fasta_split] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/7x67e3q1$ fasta_split \
                  -in_file \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/_2kv4i72/stg2aee2064-76b7-4d44-928e-d61787cf19b2/validated.fna_0.fna \
                  -out_file \
                  split_fasta.fna \
                  -logfile \
                  fast_split.log \
                  -mapping_json \
                  seq_mapping.jsonl
              protobuf arena allocated space: 10000000, used: 15304
              [job fasta_split] completed success
              [step fasta_split] completed success
              [workflow SplitInputSequences] starting step log
              [step log] start
              [job log] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/crzk72h4$ cxxlog2pb \
                  --stage \
                  SplitInputSequences < /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/k1efc2cy/stg88d7c8a7-32f3-4757-8d1f-108d47fec2bb/fast_split.log > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/crzk72h4/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log] Max memory used: 37MiB
              [job log] completed success
              [step log] completed success
              [workflow SplitInputSequences] completed success
              [step SplitInputSequences] completed success
              [workflow parallel_section] starting step AdaptorScreeningAndFilterResults
              [step AdaptorScreeningAndFilterResults] start
              [workflow AdaptorScreeningAndFilterResults] start
              [workflow AdaptorScreeningAndFilterResults] starting step blast
              [step blast] start
              [job blast] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/jegoy6f0$ vecscreen \
                  -db \
                  adaptors_for_euks \
                  -logfile \
                  vecscreen.log \
                  -out \
                  vs_unfiltered.hit \
                  -query \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/ejpypa8x/stg6cf1d028-fa9a-4e32-8798-3f137d62eb7d/split_fasta.fna \
                  -term-flex \
                  25
              [job blast] completed success
              [step blast] completed success
              [workflow AdaptorScreeningAndFilterResults] starting step log_2
              [step log_2] start
              [job log_2] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/jmmif7gk$ cxxlog2pb \
                  --stage \
                  AdaptorScreening < /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/poibvtjn/stga2195c99-b66e-4aa7-806e-a5d335a62abd/vecscreen.log > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/jmmif7gk/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log_2] Max memory used: 39MiB
              [job log_2] completed success
              [step log_2] completed success
              [workflow AdaptorScreeningAndFilterResults] starting step filter
              [step filter] start
              [job filter] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/m3hfoveu$ vecscreen_filter \
                  --filtered \
                  vs_filtered.jsonl \
                  --unfiltered \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/gw_zlk5c/stg04a5619b-9eb2-4380-bbee-6cdeb04581b9/vs_unfiltered.hit
              [job filter] Max memory used: 16MiB
              [job filter] completed success
              [step filter] completed success
              [workflow AdaptorScreeningAndFilterResults] completed success
              [step AdaptorScreeningAndFilterResults] completed success
              [workflow parallel_section] starting step ApplyHeuristicsToMakeExcludeAndTrimCalls
              [step ApplyHeuristicsToMakeExcludeAndTrimCalls] start
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] start
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] starting step make_calls
              [step make_calls] start
              [job make_calls] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/swm36puf$ make_calls \
                  -a \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/nah6x5er/stg9322ff65-2c28-4493-ba37-23b0d0520c71/vs_filtered.jsonl \
                  -logfile \
                  make_calls.log \
                  -seq-len \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/nah6x5er/stg9cf6d56d-9b75-4d9e-88f7-d4c325d0c887/seq_mapping.jsonl > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/swm36puf/combined.calls.jsonl
              [job make_calls] completed success
              [step make_calls] completed success
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] starting step log_3
              [step log_3] start
              [job log_3] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/bj6jsz1y$ cxxlog2pb \
                  --stage \
                  ApplyHeuristicsToMakeExcludeAndTrimCalls < /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/8iv63_i2/stgd59ca841-0e20-4060-a803-cc3ab0510043/make_calls.log > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/bj6jsz1y/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log_3] Max memory used: 38MiB
              [job log_3] completed success
              [step log_3] completed success
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] completed success
              [step ApplyHeuristicsToMakeExcludeAndTrimCalls] completed success
              [workflow parallel_section] starting step log_merging
              [step log_merging] start
              [job log_merging] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/wfdh_b90$ cat \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/t3qul2dn/stg25f274b7-79da-4cde-af1c-7e7f5e818f98/45ec615881c92e938694dd3ccaea7a9c03748313 \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/t3qul2dn/stgf0d526aa-f1a7-42e5-bfd2-4c434b64363e/45ec615881c92e938694dd3ccaea7a9c03748313 \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/t3qul2dn/stgdcae7f88-f9cc-4e1b-ad90-43c8aa57792b/45ec615881c92e938694dd3ccaea7a9c03748313 > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/wfdh_b90/par_sec.log
              [job log_merging] completed success
              [step log_merging] completed success
              [workflow parallel_section] completed success
              [step parallel_section] completed success
              [workflow ] starting step gather_logs
              [step gather_logs] start
              [job gather_logs] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/yvs5wq0t$ cat \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/dv0_07to/stg3e8d9390-e569-41b4-aa2d-00c4de86f6f1/par_sec.log > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/yvs5wq0t/par_sec_logs.log
              [job gather_logs] completed success
              [step gather_logs] completed success
              [workflow ] starting step seq_mapping
              [step seq_mapping] start
              [job seq_mapping] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/6qtlmyd5$ cat \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/wk5_115_/stg7c1a13de-f3fd-4f0a-9a89-8c2c26edb107/seq_mapping.jsonl > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/6qtlmyd5/seq_mapping.jsonl
              [job seq_mapping] completed success
              [step seq_mapping] completed success
              [workflow ] starting step adaptor_calls
              [step adaptor_calls] start
              [job adaptor_calls] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/x7jlrlgo$ cat \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/qhle807l/stg0069ff73-89ff-40bb-a7ce-1db7129810fb/combined.calls.jsonl > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/x7jlrlgo/adaptor_calls.jsonl
              [job adaptor_calls] completed success
              [step adaptor_calls] completed success
              [workflow ] starting step post_processor
              [step post_processor] start
              [workflow post_processor] start
              [workflow post_processor] starting step postproc_calls
              [step postproc_calls] start
              [job postproc_calls] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/n9fv173_$ postproc_calls \
                  -in_calls \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/v1b82fkf/stg953a8db3-2ab7-461d-9a75-a2d79df26903/adaptor_calls.jsonl \
                  -logfile \
                  postproc_calls.log \
                  -input_mapping \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/v1b82fkf/stg6a26dcef-6c08-4909-8eca-b9bc2898f410/seq_mapping.jsonl \
                  -out_file \
                  combined.calls.jsonl
              [job postproc_calls] completed success
              [step postproc_calls] completed success
              [workflow post_processor] starting step log_4
              [step log_4] start
              [job log_4] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/2axk1j9c$ cxxlog2pb \
                  --stage \
                  PostProcessCalls < /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/lkehloyh/stg7b8c1b48-120a-4e4a-9fdc-5607e0e952c5/postproc_calls.log > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/2axk1j9c/3bc8758dc9026d571fb8c6b8383da3db38612251
              [job log_4] Max memory used: 39MiB
              [job log_4] completed success
              [step log_4] completed success
              [workflow post_processor] completed success
              [step post_processor] completed success
              [workflow ] starting step GenerateCleanedFasta
              [step GenerateCleanedFasta] start
              [workflow GenerateCleanedFasta] start
              [workflow GenerateCleanedFasta] starting step prepare_xml_step
              [step prepare_xml_step] start
              [job prepare_xml_step] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/xlsl1lm4$ pbcalls2seqtransform \
                  --skipped \
                  skipped_trims.jsonl < /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/7efkz7mz/stg96ee4a64-cd77-4b6e-9765-2c50f617e493/combined.calls.jsonl > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/xlsl1lm4/fcs_calls.xml
              [job prepare_xml_step] Max memory used: 40MiB
              [job prepare_xml_step] completed success
              [step prepare_xml_step] completed success
              [workflow GenerateCleanedFasta] starting step seqtransform_step
              [step seqtransform_step] start
              [job seqtransform_step] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/s8xw9ngi$ seqtransform \
                  -out \
                  validated.fna_0.cleaned_fa \
                  -in \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/rk0del_5/stgd71628cc-304d-446a-8aa9-f6fa1c02c9d4/validated.fna_0.fna \
                  -seqaction-xml-file \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/rk0del_5/stgaabf9b56-dc9c-451a-8b4b-3377b77fc667/fcs_calls.xml \
                  -report \
                  seqtransform.log
              [job seqtransform_step] completed success
              [step seqtransform_step] completed success
              [workflow GenerateCleanedFasta] completed success
              [step GenerateCleanedFasta] completed success
              [workflow ] starting step collect_logs
              [step collect_logs] start
              [job collect_logs] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/cmdig0qf$ cat \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/8o2xdw7l/stg02e90ce9-f5fd-4073-b8fb-b8d8713639f2/validate_fasta.log \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/8o2xdw7l/stg87e19fef-2142-4125-a78c-fcd952d8e074/par_sec_logs.log \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/8o2xdw7l/stg2c6e58eb-6ff4-4b04-aa83-4de7c1e870d7/3bc8758dc9026d571fb8c6b8383da3db38612251 > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/cmdig0qf/logs.jsonl
              [job collect_logs] completed success
              [step collect_logs] completed success
              [workflow ] starting step GenerateReport
              [step GenerateReport] start
              [workflow GenerateReport] start
              [workflow GenerateReport] starting step log_step
              [step log_step] start
              [job log_step] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/a0c7kw60$ log_jl2tsv \
                  --infile \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/4_h703ca/stg823ed4d6-11d9-4c06-9fc3-e74a1a7cd11a/logs.jsonl \
                  --outfile \
                  fcs.log
              [job log_step] Max memory used: 16MiB
              [job log_step] completed success
              [step log_step] completed success
              [workflow GenerateReport] starting step calls_step
              [step calls_step] start
              [job calls_step] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/lwbcjnvb$ pbcalls2tsv < /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/f2s42b9v/stga269153b-ac23-4f55-8650-54a9c06d85bd/combined.calls.jsonl > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/lwbcjnvb/fcs_adaptor_report.txt
              [job calls_step] Max memory used: 17MiB
              [job calls_step] completed success
              [step calls_step] completed success
              [workflow GenerateReport] completed success
              [step GenerateReport] completed success
              [workflow ] starting step all_skipped_trims
              [step all_skipped_trims] start
              [job all_skipped_trims] /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/rxj6fc06$ cat \
                  /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/ze0_5od2/stg01a364bf-04b2-464f-b19c-ec21514a25fa/skipped_trims.jsonl > /tmp/tmpaki4qwg0/job_working_directory/000/3/tmp/rxj6fc06/skipped_trims.jsonl
              [job all_skipped_trims] completed success
              [step all_skipped_trims] completed success
              [workflow ] starting step all_cleaned_fasta
              [step all_cleaned_fasta] start
              [step all_cleaned_fasta] completed success
              [workflow ] completed success
              

            Standard Output:

            • Output will be placed in: /tmp/tmpaki4qwg0/job_working_directory/000/3/working
              Executing the workflow
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              advanced {"optional_log": null}
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              tax "--euk"
      • Step 6: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Adaptor Action report:

            • step_state: scheduled
          • Step 2: wc_gnu:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • echo "#lines" > /tmp/tmpaki4qwg0/job_working_directory/000/4/outputs/dataset_a1909fdd-75bb-4de5-a295-e6a773e0f2db.dat &&  cat '/tmp/tmpaki4qwg0/files/8/e/c/dataset_8ec0e44d-1635-4b35-94fc-f80e25b969dc.dat' | wc -l | awk '{ print $1 }' >> /tmp/tmpaki4qwg0/job_working_directory/000/4/outputs/dataset_a1909fdd-75bb-4de5-a295-e6a773e0f2db.dat

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  include_header true
                  options ["lines"]
          • Step 11: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.5+galaxy0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cat '/tmp/tmpaki4qwg0/files/5/7/c/dataset_57cd9c11-6c22-4988-b269-a17064b7ca5b.dat' >> '/tmp/tmpaki4qwg0/job_working_directory/000/13/outputs/dataset_80550061-0e2e-40b6-bfbb-d58faa9a1989.dat' && cat '/tmp/tmpaki4qwg0/files/c/e/c/dataset_cecc3041-381b-465d-8b3f-b25af3e14463.dat' >> '/tmp/tmpaki4qwg0/job_working_directory/000/13/outputs/dataset_80550061-0e2e-40b6-bfbb-d58faa9a1989.dat' && cat '/tmp/tmpaki4qwg0/files/3/8/e/dataset_38e99b09-6bd7-4ffc-befb-6f93952b609b.dat' >> '/tmp/tmpaki4qwg0/job_working_directory/000/13/outputs/dataset_80550061-0e2e-40b6-bfbb-d58faa9a1989.dat' && exit 0

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 10, "src": "hda"}]}}, {"__index__": 1, "inputs2": {"values": [{"id": 11, "src": "hda"}]}}]
          • Step 12: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 14, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 3, "src": "hda"}]}}]}}
          • Step 3: Show tail1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • set -eo pipefail; ( cat '/tmp/tmpaki4qwg0/files/a/1/9/dataset_a1909fdd-75bb-4de5-a295-e6a773e0f2db.dat' | tail -n 1 ) > '/tmp/tmpaki4qwg0/job_working_directory/000/5/outputs/dataset_6df64cd3-0214-4c61-a244-63d812cc0bf7.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  header false
                  lineNum "1"
          • Step 4: param_value_from_file:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  param_type "integer"
                  remove_newlines true
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/map_param_value/map_param_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  input_param_type {"__current_case__": 1, "input_param": "4", "mappings": [{"__index__": 0, "from": "1", "to": "False"}], "type": "integer"}
                  output_param_type "boolean"
                  unmapped {"__current_case__": 2, "default_value": "True", "on_unmapped": "default"}
          • Step 6: Select middle adaptors:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmpaki4qwg0/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpaki4qwg0/files/8/e/c/dataset_8ec0e44d-1635-4b35-94fc-f80e25b969dc.dat' '/tmp/tmpaki4qwg0/job_working_directory/000/8/outputs/dataset_add03895-1935-42c1-bee3-46fe2d6eed30.dat' '/tmp/tmpaki4qwg0/job_working_directory/000/8/configs/tmpeuij_fwc' 5 "str,int,str,str,str" 1

                Exit Code:

                • 0

                Standard Output:

                • Filtering with int(c4.split('..')[0])>100 and int(c4.split('..')[1])<int(c2)-100, 
                  kept 50.00% of 4 valid lines (4 total lines).
                  Skipped 1 invalid line(s) starting at line #4: "seq_00018	522	ACTION_EXCLUDE		CONTAMINATION_SOURCE_TYPE_ADAPTOR:NGB00360.1:Illumina PCR Primer"
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "int(c4.split('..')[0])>100 and int(c4.split('..')[1])<int(c2)-100"
                  dbkey "?"
                  header_lines "1"
          • Step 7: Select end adaptors:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmpaki4qwg0/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpaki4qwg0/files/8/e/c/dataset_8ec0e44d-1635-4b35-94fc-f80e25b969dc.dat' '/tmp/tmpaki4qwg0/job_working_directory/000/9/outputs/dataset_cecc3041-381b-465d-8b3f-b25af3e14463.dat' '/tmp/tmpaki4qwg0/job_working_directory/000/9/configs/tmpf9_xj52j' 5 "str,int,str,str,str" 0

                Exit Code:

                • 0

                Standard Output:

                • Filtering with int(c4.split('..')[0])<=100 or int(c4.split('..')[1])>=int(c2)-100, 
                  kept 33.33% of 3 valid lines (4 total lines).
                  Skipped 1 invalid line(s) starting at line #4: "seq_00018	522	ACTION_EXCLUDE		CONTAMINATION_SOURCE_TYPE_ADAPTOR:NGB00360.1:Illumina PCR Primer"
                  Skipped 1 comment (starting with #) or blank line(s)
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "int(c4.split('..')[0])<=100 or int(c4.split('..')[1])>=int(c2)-100"
                  dbkey "?"
                  header_lines "0"
          • Step 8: Select sequences to exclude:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmpaki4qwg0/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpaki4qwg0/files/8/e/c/dataset_8ec0e44d-1635-4b35-94fc-f80e25b969dc.dat' '/tmp/tmpaki4qwg0/job_working_directory/000/10/outputs/dataset_38e99b09-6bd7-4ffc-befb-6f93952b609b.dat' '/tmp/tmpaki4qwg0/job_working_directory/000/10/configs/tmp_22849ak' 5 "str,int,str,str,str" 0

                Exit Code:

                • 0

                Standard Output:

                • Filtering with c4=="", 
                  kept 33.33% of 3 valid lines (4 total lines).
                  Skipped 1 comment (starting with #) or blank line(s)
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "c4==\"\""
                  dbkey "?"
                  header_lines "0"
          • Step 9: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 9, "src": "hda"}]}}]}}
          • Step 10: Replace trimming by masking:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • awk -v OFS="\t" -v FS="\t" --re-interval --sandbox '{ $3 = gensub( /ACTION_TRIM/, "FIX", "g", $3 ) ; print $0 ; }' '/tmp/tmpaki4qwg0/files/a/d/d/dataset_add03895-1935-42c1-bee3-46fe2d6eed30.dat' > '/tmp/tmpaki4qwg0/job_working_directory/000/12/outputs/dataset_57cd9c11-6c22-4988-b269-a17064b7ca5b.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "a5275df3853411f0ac697ced8d1b08f5"
                  chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "column": "3", "find_pattern": "ACTION_TRIM", "replace_pattern": "FIX"}]
      • Step 7: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • GX_NUM_CORES=${GALAXY_SLOTS:-2} gx clean-genome --input '/tmp/tmpaki4qwg0/files/c/9/b/dataset_c9b0f52c-388b-4cc7-bd32-5dd21fee60cd.dat' --action-report '/tmp/tmpaki4qwg0/files/8/0/5/dataset_80550061-0e2e-40b6-bfbb-d58faa9a1989.dat' --contam-fasta-out 'contam.fa' --min-seq-len '200' --output 'clean.fa'

            Exit Code:

            • 0

            Standard Error:

            • Applied 3 actions; 2 seqs dropped; 580 bps dropped; 0 bps lowercased; 58 bps hardmasked.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 1, "action_report": {"values": [{"id": 15, "src": "hda"}]}, "input": {"values": [{"id": 2, "src": "hda"}]}, "min_seq_len": "200", "mode_selector": "clean"}
      • Step 8: soft-masking:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • dustmasker -in '/tmp/tmpaki4qwg0/files/b/b/f/dataset_bbff91ed-3ed9-4ae0-8c38-f70ae69c3167.dat' -infmt fasta -out '/tmp/tmpaki4qwg0/job_working_directory/000/16/outputs/dataset_ddc1e420-cd07-4f72-b4d4-8c51140da37d.dat' -window 64 -level 40 -linker 1 -outfmt fasta

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              db_opts {"__current_case__": 2, "database": "", "db_opts_selector": "file", "histdb": "", "subject": {"values": [{"id": 17, "src": "hda"}]}}
              dbkey "?"
              level "40"
              linker "1"
              outformat "fasta"
              window "64"
      • Step 9: toolshed.g2.bx.psu.edu/repos/devteam/fasta_filter_by_length/fasta_filter_by_length/1.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/fasta_filter_by_length/8cacfcf96a52/fasta_filter_by_length/fasta_filter_by_length.py' '/tmp/tmpaki4qwg0/files/d/d/c/dataset_ddc1e420-cd07-4f72-b4d4-8c51140da37d.dat' 0 0 '/tmp/tmpaki4qwg0/job_working_directory/000/17/outputs/dataset_bc5429e4-340f-4fda-8348-e4b523f40087.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              max_length "0"
              min_length "0"
      • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is error

            Command Line:

            • if [ ! -e "${TMPDIR}/sync-files-completed.txt" ]; then mkdir -p "${TMPDIR}" && sync_files.py get --mft 'https://ncbi-fcs-gx.s3.amazonaws.com/gxdb/latest/all.manifest' --dir "${TMPDIR}" > /dev/null 2>&1 && touch "${TMPDIR}/sync-files-completed.txt"; fi && GX_NUM_CORES=${GALAXY_SLOTS:-2} run_gx.py --phone-home-label 'usegalaxy.org' --fasta '/tmp/tmpaki4qwg0/files/d/d/c/dataset_ddc1e420-cd07-4f72-b4d4-8c51140da37d.dat' --tax-id '9606' --species 'Homo Sapiens' --split-fasta 'false' --gx-db "${TMPDIR}" --out-basename output --action-report true --generate-logfile false

            Exit Code:

            • 1

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "a5275df2853411f0ac697ced8d1b08f5"
              chromInfo "/tmp/tmpaki4qwg0/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 0, "config_tag": "all", "fasta": {"values": [{"id": 18, "src": "hda"}]}, "id": {"__current_case__": 1, "id_selector": "ncbi_tax", "tax_id": "9606"}, "mode_selector": "screen", "screen_adv": {"div": "", "gx_align_exclude_taxa": "", "gx_extra_contam_divs": null, "ignore_same_kingdom": false, "split_fasta": false}, "species": "Homo Sapiens"}
    • Other invocation details
      • history_id

        • 0a2004fc025d403a
      • history_state

        • error
      • invocation_id

        • 0a2004fc025d403a
      • invocation_state

        • scheduled
      • workflow_id

        • 59d21deb9c54097d

@mvdbeek
Member

mvdbeek commented Sep 1, 2025

fcs-gx just quits with exit code 1 and no message on stderr. I don't know what's up with that.
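
One thing worth noting from the failing job's command line: the `sync_files.py` database download is wrapped in `> /dev/null 2>&1`, so a failed or truncated gxdb sync would also produce exit code 1 with an empty stderr. A minimal debugging sketch (hypothetical paths, run outside Galaxy in the ncbi-fcs-gx container) that stages the database directory without silencing the sync:

```shell
#!/usr/bin/env sh
# Sketch only: reproduce the fcs-gx screen step without the
# "> /dev/null 2>&1" redirect, so a failed database sync is visible.
set -eu

GX_DB="${TMPDIR:-/tmp}/gxdb"
mkdir -p "$GX_DB"

# Uncomment to actually sync the (large) gxdb and screen the assembly;
# both scripts come from the same container the Galaxy tool uses.
# sync_files.py get --mft 'https://ncbi-fcs-gx.s3.amazonaws.com/gxdb/latest/all.manifest' --dir "$GX_DB"
# run_gx.py --fasta cleaned.fa --tax-id 9606 --species 'Homo Sapiens' \
#     --gx-db "$GX_DB" --out-basename output --action-report true

echo "gx-db staged at $GX_DB"
```

If the sync is the culprit, dropping the redirect should surface the real error message.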


github-actions bot commented Sep 2, 2025

Test Results (powered by Planemo)

Test Summary

Test State Count
Total 1
Passed 0
Error 1
Failure 0
Skipped 0
Errored Tests
  • ❌ Assembly-decontamination-VGP9.ga_0

    Workflow invocation details

    • Invocation Messages

    • Steps
      • Step 1: Scaffolded assembly (fasta):

        • step_state: scheduled
      • Step 2: Taxonomic Identifier:

        • step_state: scheduled
      • Step 11: soft-masking:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • dustmasker -in '/tmp/tmpz8it333k/files/c/2/9/dataset_c291792b-dfff-4fbf-8fd6-ab674eb66044.dat' -infmt fasta -out '/tmp/tmpz8it333k/job_working_directory/000/17/outputs/dataset_7e195d06-30eb-49b9-a5ee-891487706e73.dat' -window 64 -level 40 -linker 1 -outfmt fasta

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              db_opts {"__current_case__": 2, "database": "", "db_opts_selector": "file", "histdb": "", "subject": {"values": [{"id": 18, "src": "hda"}]}}
              dbkey "?"
              level "40"
              linker "1"
              outformat "fasta"
              window "64"
      • Step 12: toolshed.g2.bx.psu.edu/repos/devteam/fasta_filter_by_length/fasta_filter_by_length/1.2:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • python '/tmp/shed_dir/toolshed.g2.bx.psu.edu/repos/devteam/fasta_filter_by_length/8cacfcf96a52/fasta_filter_by_length/fasta_filter_by_length.py' '/tmp/tmpz8it333k/files/7/e/1/dataset_7e195d06-30eb-49b9-a5ee-891487706e73.dat' 0 0 '/tmp/tmpz8it333k/job_working_directory/000/18/outputs/dataset_e2fd736f-4b9c-4297-97b6-b510b53248a7.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              max_length "0"
              min_length "0"
      • Step 13: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is error

            Command Line:

            • if [ ! -e "${TMPDIR}/sync-files-completed.txt" ]; then mkdir -p "${TMPDIR}" && sync_files.py get --mft 'https://ncbi-fcs-gx.s3.amazonaws.com/gxdb/latest/all.manifest' --dir "${TMPDIR}" > /dev/null 2>&1 && touch "${TMPDIR}/sync-files-completed.txt"; fi && GX_NUM_CORES=${GALAXY_SLOTS:-2} run_gx.py --phone-home-label 'usegalaxy.org' --fasta '/tmp/tmpz8it333k/files/7/e/1/dataset_7e195d06-30eb-49b9-a5ee-891487706e73.dat' --tax-id '9606' --species 'Homo Sapiens' --split-fasta 'true' --gx-db "${TMPDIR}" --out-basename output --action-report true --generate-logfile false

            Exit Code:

            • 1

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 0, "config_tag": "all", "fasta": {"values": [{"id": 19, "src": "hda"}]}, "id": {"__current_case__": 1, "id_selector": "ncbi_tax", "tax_id": "9606"}, "mode_selector": "screen", "screen_adv": {"div": "", "gx_align_exclude_taxa": "", "gx_extra_contam_divs": null, "ignore_same_kingdom": false, "split_fasta": true}, "species": "Homo Sapiens"}
      • Step 14: hard-masking:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • sed --sandbox -r -f '/tmp/tmpz8it333k/job_working_directory/000/20/configs/tmpd83133gm' '/tmp/tmpz8it333k/files/e/2/f/dataset_e2fd736f-4b9c-4297-97b6-b510b53248a7.dat' > '/tmp/tmpz8it333k/job_working_directory/000/20/outputs/dataset_2b8b7031-c270-4966-b01d-1942f98b6aa7.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              adv_opts {"__current_case__": 0, "adv_opts_selector": "basic"}
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              code "/^>/!y/atcgn/NNNNN/"
              dbkey "?"
      • Step 15: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 1, "action_report": {"values": [{"id": 22, "src": "hda"}]}, "input": {"values": [{"id": 19, "src": "hda"}]}, "min_seq_len": "200", "mode_selector": "clean"}
      • Step 16: blast mitochondria DB:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • blastn  -query '/tmp/tmpz8it333k/files/2/b/8/dataset_2b8b7031-c270-4966-b01d-1942f98b6aa7.dat'   -db '"/cvmfs/data.galaxyproject.org/byhand/refseq/mitochondrion/genomic/2022-03-10/mitochondrion"'  -task 'blastn' -evalue '0.001' -out '/tmp/tmpz8it333k/job_working_directory/000/22/outputs/dataset_ecb12d96-7209-4ba4-9292-2a5067bf7f6a.dat' -outfmt '6 qseqid sseqid length qstart qend evalue qlen qcovs qcovhsp'  -num_threads "${GALAXY_SLOTS:-8}"

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              adv_opts {"__current_case__": 0, "adv_opts_selector": "basic"}
              blast_type "blastn"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              db_opts {"__current_case__": 0, "database": ["refseq_mitochondrion"], "db_opts_selector": "db", "histdb": "", "subject": ""}
              dbkey "?"
              evalue_cutoff "0.001"
              output {"__current_case__": 2, "ext_cols": ["qlen"], "ids_cols": null, "misc_cols": ["qcovs", "qcovhsp"], "out_format": "cols", "std_cols": ["qseqid", "sseqid", "length", "qstart", "qend", "evalue"], "tax_cols": null}
      • Step 17: parsing blast output:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • parse_mito_blast.py --blastout '/tmp/tmpz8it333k/files/e/c/b/dataset_ecb12d96-7209-4ba4-9292-2a5067bf7f6a.dat'

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
      • Step 18: removing scaffolds:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is paused

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode_condition {"__current_case__": 0, "discover_paths": false, "homopolymer_compress": null, "output_condition": {"__current_case__": 1, "line_length": null, "out_format": "fasta.gz"}, "remove_terminal_gaps": true, "selector": "manipulation", "sort": "", "swiss_army_knife": null}
              target_condition {"__current_case__": 1, "exclude_bed": {"values": [{"id": 28, "src": "hda"}]}, "include_bed": null, "target_option": "true", "target_sequence": ""}
      • Step 3: Species Binomial Name:

        • step_state: scheduled
      • Step 4: Assembly Name:

        • step_state: scheduled
      • Step 5: Haplotype:

        • step_state: scheduled
      • Step 6: Maximum length of sequence to consider for mitochondrial scaffolds:

        • step_state: scheduled
      • Step 7: toolshed.g2.bx.psu.edu/repos/richard-burhans/ncbi_fcs_adaptor/ncbi_fcs_adaptor/0.5.0+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • /app/fcs/bin/av_screen_x -o "$(pwd)" '--euk' '/tmp/tmpz8it333k/files/7/b/e/dataset_7be46a3b-fdb6-4a03-b3b9-a9c6032bb38d.dat'

            Exit Code:

            • 0

            Standard Error:

            • Resolved '/app/fcs/progs/ForeignContaminationScreening.cwl' to 'file:///app/fcs/progs/ForeignContaminationScreening.cwl'
              [workflow ] start
              [workflow ] starting step ValidateInputSequences
              [step ValidateInputSequences] start
              [job ValidateInputSequences] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/_iofqbx_$ validate_fasta \
                  --jsonl \
                  validate_fasta.log \
                  --fasta-output \
                  validated.fna \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/1vtd2y5a/stgb76237e3-6220-4a52-b14e-747931d4721c/dataset_7be46a3b-fdb6-4a03-b3b9-a9c6032bb38d.dat > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/_iofqbx_/validate_fasta.txt
              [job ValidateInputSequences] Max memory used: 39MiB
              [job ValidateInputSequences] completed success
              [step ValidateInputSequences] completed success
              [workflow ] starting step parallel_section
              [step parallel_section] start
              [workflow parallel_section] start
              [workflow parallel_section] starting step SplitInputSequences
              [step SplitInputSequences] start
              [workflow SplitInputSequences] start
              [workflow SplitInputSequences] starting step fasta_split
              [step fasta_split] start
              [job fasta_split] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/wtsy4w0z$ fasta_split \
                  -in_file \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/bnji4hrt/stg7f770f7e-379e-452b-92c3-04d679f6b2bc/validated.fna_0.fna \
                  -out_file \
                  split_fasta.fna \
                  -logfile \
                  fast_split.log \
                  -mapping_json \
                  seq_mapping.jsonl
              protobuf arena allocated space: 10000000, used: 15304
              [job fasta_split] completed success
              [step fasta_split] completed success
              [workflow SplitInputSequences] starting step log
              [step log] start
              [job log] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/7_u5n0pz$ cxxlog2pb \
                  --stage \
                  SplitInputSequences < /tmp/tmpz8it333k/job_working_directory/000/3/tmp/tivfnxku/stge0c91878-d889-4280-add6-348996397232/fast_split.log > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/7_u5n0pz/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log] Max memory used: 40MiB
              [job log] completed success
              [step log] completed success
              [workflow SplitInputSequences] completed success
              [step SplitInputSequences] completed success
              [workflow parallel_section] starting step AdaptorScreeningAndFilterResults
              [step AdaptorScreeningAndFilterResults] start
              [workflow AdaptorScreeningAndFilterResults] start
              [workflow AdaptorScreeningAndFilterResults] starting step blast
              [step blast] start
              [job blast] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/7lkj579h$ vecscreen \
                  -db \
                  adaptors_for_euks \
                  -logfile \
                  vecscreen.log \
                  -out \
                  vs_unfiltered.hit \
                  -query \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/4zhuyvhi/stg2c17b6bb-0d26-4cb4-ae51-00419fc12817/split_fasta.fna \
                  -term-flex \
                  25
              [job blast] completed success
              [step blast] completed success
              [workflow AdaptorScreeningAndFilterResults] starting step filter
              [step filter] start
              [job filter] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/e26xk3r4$ vecscreen_filter \
                  --filtered \
                  vs_filtered.jsonl \
                  --unfiltered \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/l0ac0o9s/stgf433fe94-a5c2-44bb-98cb-ef2bbb296c48/vs_unfiltered.hit
              [job filter] Max memory used: 16MiB
              [job filter] completed success
              [step filter] completed success
              [workflow AdaptorScreeningAndFilterResults] starting step log_2
              [step log_2] start
              [job log_2] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/0_a1yc0t$ cxxlog2pb \
                  --stage \
                  AdaptorScreening < /tmp/tmpz8it333k/job_working_directory/000/3/tmp/35nvv5nr/stg70c7a787-b333-47e9-b9e8-1f2c3748c4fd/vecscreen.log > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/0_a1yc0t/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log_2] Max memory used: 39MiB
              [job log_2] completed success
              [step log_2] completed success
              [workflow AdaptorScreeningAndFilterResults] completed success
              [step AdaptorScreeningAndFilterResults] completed success
              [workflow parallel_section] starting step ApplyHeuristicsToMakeExcludeAndTrimCalls
              [step ApplyHeuristicsToMakeExcludeAndTrimCalls] start
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] start
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] starting step make_calls
              [step make_calls] start
              [job make_calls] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/jmu5104o$ make_calls \
                  -a \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/d6hqg9fa/stg975ea8f4-bc9c-42c2-8cd1-03fdc453c6ff/vs_filtered.jsonl \
                  -logfile \
                  make_calls.log \
                  -seq-len \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/d6hqg9fa/stg96d3ea0b-5b88-4685-b5dd-5217f39245cd/seq_mapping.jsonl > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/jmu5104o/combined.calls.jsonl
              [job make_calls] completed success
              [step make_calls] completed success
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] starting step log_3
              [step log_3] start
              [job log_3] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/0sl6r9rz$ cxxlog2pb \
                  --stage \
                  ApplyHeuristicsToMakeExcludeAndTrimCalls < /tmp/tmpz8it333k/job_working_directory/000/3/tmp/f94ftzd4/stgdf2db286-cf87-42c3-9af0-f8fbe4582462/make_calls.log > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/0sl6r9rz/45ec615881c92e938694dd3ccaea7a9c03748313
              [job log_3] Max memory used: 40MiB
              [job log_3] completed success
              [step log_3] completed success
              [workflow ApplyHeuristicsToMakeExcludeAndTrimCalls] completed success
              [step ApplyHeuristicsToMakeExcludeAndTrimCalls] completed success
              [workflow parallel_section] starting step log_merging
              [step log_merging] start
              [job log_merging] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/hnbf4gda$ cat \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/w6tfzvy6/stg7298b6ee-8ea0-4557-bf4d-7b0a53b7a650/45ec615881c92e938694dd3ccaea7a9c03748313 \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/w6tfzvy6/stg4ab5144a-1037-4f60-83fe-892143ee6e88/45ec615881c92e938694dd3ccaea7a9c03748313 \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/w6tfzvy6/stgc1bff21a-1aeb-41b9-9d88-b832ea03e3e8/45ec615881c92e938694dd3ccaea7a9c03748313 > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/hnbf4gda/par_sec.log
              [job log_merging] completed success
              [step log_merging] completed success
              [workflow parallel_section] completed success
              [step parallel_section] completed success
              [workflow ] starting step seq_mapping
              [step seq_mapping] start
              [job seq_mapping] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/puyzbuz_$ cat \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/f8owpsv9/stg0d702eb3-d45f-4103-af60-58e739e4d2e2/seq_mapping.jsonl > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/puyzbuz_/seq_mapping.jsonl
              [job seq_mapping] completed success
              [step seq_mapping] completed success
              [workflow ] starting step gather_logs
              [step gather_logs] start
              [job gather_logs] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/z4lnqtg4$ cat \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/w9638f89/stg1332e0f9-2841-45c7-b029-f2410a28ef26/par_sec.log > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/z4lnqtg4/par_sec_logs.log
              [job gather_logs] completed success
              [step gather_logs] completed success
              [workflow ] starting step adaptor_calls
              [step adaptor_calls] start
              [job adaptor_calls] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/vcbl0m1o$ cat \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/tyvaisva/stga81b0261-efb9-44bc-a58f-e96b01761556/combined.calls.jsonl > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/vcbl0m1o/adaptor_calls.jsonl
              [job adaptor_calls] completed success
              [step adaptor_calls] completed success
              [workflow ] starting step post_processor
              [step post_processor] start
              [workflow post_processor] start
              [workflow post_processor] starting step postproc_calls
              [step postproc_calls] start
              [job postproc_calls] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/b_siqkrj$ postproc_calls \
                  -in_calls \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/k16bask5/stgc4d68866-6d14-477f-9df0-0af9a63354e0/adaptor_calls.jsonl \
                  -logfile \
                  postproc_calls.log \
                  -input_mapping \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/k16bask5/stg501c5f86-8406-4e66-84fe-ca41f84a4df4/seq_mapping.jsonl \
                  -out_file \
                  combined.calls.jsonl
              [job postproc_calls] completed success
              [step postproc_calls] completed success
              [workflow post_processor] starting step log_4
              [step log_4] start
              [job log_4] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/vpgh8p86$ cxxlog2pb \
                  --stage \
                  PostProcessCalls < /tmp/tmpz8it333k/job_working_directory/000/3/tmp/eka02pmi/stg9e59670a-ed19-425e-a8dc-396b061e2f6b/postproc_calls.log > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/vpgh8p86/3bc8758dc9026d571fb8c6b8383da3db38612251
              [job log_4] Max memory used: 39MiB
              [job log_4] completed success
              [step log_4] completed success
              [workflow post_processor] completed success
              [step post_processor] completed success
              [workflow ] starting step collect_logs
              [step collect_logs] start
              [job collect_logs] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/fswa3sn6$ cat \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/hjca6l3r/stg09b05ba7-cd49-47c4-b134-7af6f0b5d1f6/validate_fasta.log \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/hjca6l3r/stg13d2ea12-3d5f-4d5c-9d13-96dbce4d0b16/par_sec_logs.log \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/hjca6l3r/stge834eba0-1b70-4363-8009-702606ce2269/3bc8758dc9026d571fb8c6b8383da3db38612251 > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/fswa3sn6/logs.jsonl
              [job collect_logs] completed success
              [step collect_logs] completed success
              [workflow ] starting step GenerateCleanedFasta
              [step GenerateCleanedFasta] start
              [workflow GenerateCleanedFasta] start
              [workflow GenerateCleanedFasta] starting step prepare_xml_step
              [step prepare_xml_step] start
              [job prepare_xml_step] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/4s70y_p9$ pbcalls2seqtransform \
                  --skipped \
                  skipped_trims.jsonl < /tmp/tmpz8it333k/job_working_directory/000/3/tmp/9eb8i7n_/stg29ec179c-0d85-4193-ac63-dec8f8ffedba/combined.calls.jsonl > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/4s70y_p9/fcs_calls.xml
              [job prepare_xml_step] Max memory used: 42MiB
              [job prepare_xml_step] completed success
              [step prepare_xml_step] completed success
              [workflow GenerateCleanedFasta] starting step seqtransform_step
              [step seqtransform_step] start
              [job seqtransform_step] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/2d0ucm9t$ seqtransform \
                  -out \
                  validated.fna_0.cleaned_fa \
                  -in \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/0j9sbl9x/stg0775e584-b536-41d0-8dcb-f5271271ff05/validated.fna_0.fna \
                  -seqaction-xml-file \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/0j9sbl9x/stg388d3920-d814-4bd0-b1db-81a9e48415dc/fcs_calls.xml \
                  -report \
                  seqtransform.log
              [job seqtransform_step] completed success
              [step seqtransform_step] completed success
              [workflow GenerateCleanedFasta] completed success
              [step GenerateCleanedFasta] completed success
              [workflow ] starting step all_cleaned_fasta
              [step all_cleaned_fasta] start
              [step all_cleaned_fasta] completed success
              [workflow ] starting step GenerateReport
              [step GenerateReport] start
              [workflow GenerateReport] start
              [workflow GenerateReport] starting step log_step
              [step log_step] start
              [job log_step] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/72utog01$ log_jl2tsv \
                  --infile \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/bry98mjb/stg0cfe969b-aa42-43e0-a2ec-548a2383c50a/logs.jsonl \
                  --outfile \
                  fcs.log
              [job log_step] Max memory used: 16MiB
              [job log_step] completed success
              [step log_step] completed success
              [workflow GenerateReport] starting step calls_step
              [step calls_step] start
              [job calls_step] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/fswb0nfv$ pbcalls2tsv < /tmp/tmpz8it333k/job_working_directory/000/3/tmp/0z3582y8/stg8182ffb7-249d-4705-ad7b-189374414540/combined.calls.jsonl > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/fswb0nfv/fcs_adaptor_report.txt
              [job calls_step] Max memory used: 17MiB
              [job calls_step] completed success
              [step calls_step] completed success
              [workflow GenerateReport] completed success
              [step GenerateReport] completed success
              [workflow ] starting step all_skipped_trims
              [step all_skipped_trims] start
              [job all_skipped_trims] /tmp/tmpz8it333k/job_working_directory/000/3/tmp/nj27nqo8$ cat \
                  /tmp/tmpz8it333k/job_working_directory/000/3/tmp/kkr2ac8u/stgdb3b5422-4e17-4ab8-9b58-6f0f98d5fcef/skipped_trims.jsonl > /tmp/tmpz8it333k/job_working_directory/000/3/tmp/nj27nqo8/skipped_trims.jsonl
              [job all_skipped_trims] completed success
              [step all_skipped_trims] completed success
              [workflow ] completed success
              

            Standard Output:

            • Output will be placed in: /tmp/tmpz8it333k/job_working_directory/000/3/working
              Executing the workflow
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              advanced {"optional_log": null}
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              tax "--euk"
      • Step 8: toolshed.g2.bx.psu.edu/repos/iuc/compose_text_param/compose_text_param/0.1.1:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • cd ../; python _evaluate_expression_.py

            Exit Code:

            • 0

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              components [{"__index__": 0, "param_type": {"__current_case__": 0, "component_value": "Species : ", "select_param_type": "text"}}, {"__index__": 1, "param_type": {"__current_case__": 0, "component_value": "Homo Sapiens", "select_param_type": "text"}}, {"__index__": 2, "param_type": {"__current_case__": 0, "component_value": ". Assembly: ", "select_param_type": "text"}}, {"__index__": 3, "param_type": {"__current_case__": 0, "component_value": "Hg19", "select_param_type": "text"}}, {"__index__": 4, "param_type": {"__current_case__": 0, "component_value": " ", "select_param_type": "text"}}, {"__index__": 5, "param_type": {"__current_case__": 0, "component_value": "Haplotype 1", "select_param_type": "text"}}]
              dbkey "?"
      • Step 9: Unlabelled step:

        • step_state: scheduled

        • Subworkflow Steps
          • Step 1: Adaptor Action report:

            • step_state: scheduled
          • Step 2: wc_gnu:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • echo "#lines" > /tmp/tmpz8it333k/job_working_directory/000/5/outputs/dataset_f275d654-21a7-4b39-b642-f29f2dbce358.dat &&  cat '/tmp/tmpz8it333k/files/4/b/8/dataset_4b867762-bf09-4d7e-be7f-031b3037accf.dat' | wc -l | awk '{ print $1 }' >> /tmp/tmpz8it333k/job_working_directory/000/5/outputs/dataset_f275d654-21a7-4b39-b642-f29f2dbce358.dat

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  include_header true
                  options ["lines"]
          • Step 11: toolshed.g2.bx.psu.edu/repos/bgruening/text_processing/tp_cat/9.5+galaxy0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cat '/tmp/tmpz8it333k/files/4/9/c/dataset_49ce81d7-d6f1-4525-a81f-43a466ae6da0.dat' >> '/tmp/tmpz8it333k/job_working_directory/000/14/outputs/dataset_8ad560ef-9c94-412a-871e-a7e982b06299.dat' && cat '/tmp/tmpz8it333k/files/3/5/1/dataset_351ca60c-81fb-4939-b63f-94d202bd2827.dat' >> '/tmp/tmpz8it333k/job_working_directory/000/14/outputs/dataset_8ad560ef-9c94-412a-871e-a7e982b06299.dat' && cat '/tmp/tmpz8it333k/files/1/e/d/dataset_1eda8d9b-b5b9-43fe-ab09-9e24a5584711.dat' >> '/tmp/tmpz8it333k/job_working_directory/000/14/outputs/dataset_8ad560ef-9c94-412a-871e-a7e982b06299.dat' && exit 0

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  queries [{"__index__": 0, "inputs2": {"values": [{"id": 11, "src": "hda"}]}}, {"__index__": 1, "inputs2": {"values": [{"id": 12, "src": "hda"}]}}]
          • Step 12: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 15, "src": "hda"}]}}, {"__index__": 1, "value": {"values": [{"id": 3, "src": "hda"}]}}]}}
          • Step 3: Show tail1:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • set -eo pipefail; ( cat '/tmp/tmpz8it333k/files/f/2/7/dataset_f275d654-21a7-4b39-b642-f29f2dbce358.dat' | tail -n 1 ) > '/tmp/tmpz8it333k/job_working_directory/000/6/outputs/dataset_e7fd136a-e204-410e-8c6d-ae4a3b3cedd0.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  header false
                  lineNum "1"
          • Step 4: param_value_from_file:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  param_type "integer"
                  remove_newlines true
          • Step 5: toolshed.g2.bx.psu.edu/repos/iuc/map_param_value/map_param_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  input_param_type {"__current_case__": 1, "input_param": "4", "mappings": [{"__index__": 0, "from": "1", "to": "False"}], "type": "integer"}
                  output_param_type "boolean"
                  unmapped {"__current_case__": 2, "default_value": "True", "on_unmapped": "default"}
          • Step 6: Select middle adaptors:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmpz8it333k/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpz8it333k/files/4/b/8/dataset_4b867762-bf09-4d7e-be7f-031b3037accf.dat' '/tmp/tmpz8it333k/job_working_directory/000/9/outputs/dataset_db6c3c8f-16e8-413b-a96f-1f5c7a548c97.dat' '/tmp/tmpz8it333k/job_working_directory/000/9/configs/tmpgcfg0726' 5 "str,int,str,str,str" 1

                Exit Code:

                • 0

                Standard Output:

                • Filtering with int(c4.split('..')[0])>100 and int(c4.split('..')[1])<int(c2)-100, 
                  kept 50.00% of 4 valid lines (4 total lines).
                  Skipped 1 invalid line(s) starting at line #4: "seq_00018	522	ACTION_EXCLUDE		CONTAMINATION_SOURCE_TYPE_ADAPTOR:NGB00360.1:Illumina PCR Primer"
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "int(c4.split('..')[0])>100 and int(c4.split('..')[1])<int(c2)-100"
                  dbkey "?"
                  header_lines "1"
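The filter condition above keeps only adaptor hits whose range lies more than 100 bp from both sequence ends ("middle" adaptors, which will be masked rather than trimmed). A minimal Python sketch of the same test, assuming the report's column layout (c1 = sequence id, c2 = sequence length, c3 = action, c4 = range as "start..end", c5 = source); the helper name is hypothetical, not part of filtering.py:

```python
def is_middle_hit(row, margin=100):
    """True when the adaptor range sits more than `margin` bp from both ends.

    Mirrors the Filter condition: int(c4.split('..')[0]) > 100 and
    int(c4.split('..')[1]) < int(c2) - 100. Rows with an empty range
    (whole-sequence ACTION_EXCLUDE calls) are treated as non-positional
    here; the actual tool skips them as invalid lines, as the log shows.
    """
    seq_len = int(row[1])
    rng = row[3]
    if not rng:
        return False
    start, end = (int(x) for x in rng.split(".."))
    return start > margin and end < seq_len - margin


# Illustrative rows only (not taken from the test data):
rows = [
    ["seq_a", "2500", "ACTION_TRIM", "1150..1179", "adaptor"],   # middle
    ["seq_b", "2500", "ACTION_TRIM", "2471..2500", "adaptor"],   # at the end
    ["seq_c", "522", "ACTION_EXCLUDE", "", "adaptor"],           # no range
]
middle = [r for r in rows if is_middle_hit(r)]
```

The "Select end adaptors" step (Step 7) applies the exact complement of this positional test, so together the two filters partition the positional calls.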
          • Step 7: Select end adaptors:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmpz8it333k/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpz8it333k/files/4/b/8/dataset_4b867762-bf09-4d7e-be7f-031b3037accf.dat' '/tmp/tmpz8it333k/job_working_directory/000/10/outputs/dataset_351ca60c-81fb-4939-b63f-94d202bd2827.dat' '/tmp/tmpz8it333k/job_working_directory/000/10/configs/tmp4nbgczyx' 5 "str,int,str,str,str" 0

                Exit Code:

                • 0

                Standard Output:

                • Filtering with int(c4.split('..')[0])<=100 or int(c4.split('..')[1])>=int(c2)-100, 
                  kept 33.33% of 3 valid lines (4 total lines).
                  Skipped 1 invalid line(s) starting at line #4: "seq_00018	522	ACTION_EXCLUDE		CONTAMINATION_SOURCE_TYPE_ADAPTOR:NGB00360.1:Illumina PCR Primer"
                  Skipped 1 comment (starting with #) or blank line(s)
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "int(c4.split('..')[0])<=100 or int(c4.split('..')[1])>=int(c2)-100"
                  dbkey "?"
                  header_lines "0"
          • Step 8: Select sequences to exclude:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • python '/tmp/tmpz8it333k/galaxy-dev/tools/stats/filtering.py' '/tmp/tmpz8it333k/files/4/b/8/dataset_4b867762-bf09-4d7e-be7f-031b3037accf.dat' '/tmp/tmpz8it333k/job_working_directory/000/11/outputs/dataset_1eda8d9b-b5b9-43fe-ab09-9e24a5584711.dat' '/tmp/tmpz8it333k/job_working_directory/000/11/configs/tmpjrnjz8wu' 5 "str,int,str,str,str" 0

                Exit Code:

                • 0

                Standard Output:

                • Filtering with c4=="", 
                  kept 33.33% of 3 valid lines (4 total lines).
                  Skipped 1 comment (starting with #) or blank line(s)
                  

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "tabular"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  cond "c4==\"\""
                  dbkey "?"
                  header_lines "0"
          • Step 9: toolshed.g2.bx.psu.edu/repos/iuc/pick_value/pick_value/0.2.0:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • cd ../; python _evaluate_expression_.py

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  style_cond {"__current_case__": 0, "pick_style": "first", "type_cond": {"__current_case__": 4, "param_type": "data", "pick_from": [{"__index__": 0, "value": {"values": [{"id": 10, "src": "hda"}]}}]}}
          • Step 10: Replace trimming by masking:

            • step_state: scheduled

            • Jobs
              • Job 1:

                • Job state is ok

                Command Line:

                • awk -v OFS="\t" -v FS="\t" --re-interval --sandbox '{ $3 = gensub( /ACTION_TRIM/, "FIX", "g", $3 ) ; print $0 ; }' '/tmp/tmpz8it333k/files/d/b/6/dataset_db6c3c8f-16e8-413b-a96f-1f5c7a548c97.dat' > '/tmp/tmpz8it333k/job_working_directory/000/13/outputs/dataset_49ce81d7-d6f1-4525-a81f-43a466ae6da0.dat'

                Exit Code:

                • 0

                Traceback:

                Job Parameters:

                • Job parameter Parameter value
                  __input_ext "input"
                  __workflow_invocation_uuid__ "5742fb27881a11f0ac697ced8d2915de"
                  chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
                  dbkey "?"
                  replacements [{"__index__": 0, "column": "3", "find_pattern": "ACTION_TRIM", "replace_pattern": "FIX"}]
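The awk step above rewrites ACTION_TRIM to FIX in column 3 of the middle-adaptor calls, so that FCS-GX hardmasks those regions instead of trimming them in the clean-genome step. A sketch of the same per-line substitution in Python (function name is illustrative, assuming the tab-separated report layout):

```python
def trim_to_fix(line):
    """Replace ACTION_TRIM with FIX in the third tab-separated column,
    leaving every other column untouched (equivalent to the awk gensub call)."""
    cols = line.rstrip("\n").split("\t")
    if len(cols) >= 3:
        cols[2] = cols[2].replace("ACTION_TRIM", "FIX")
    return "\t".join(cols)
```

FCS-GX interprets FIX actions as masking, which is why the cleaned FASTA in Step 10 reports "58 bps hardmasked" alongside the dropped sequences.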
      • Step 10: toolshed.g2.bx.psu.edu/repos/iuc/ncbi_fcs_gx/ncbi_fcs_gx/0.5.5+galaxy0:

        • step_state: scheduled

        • Jobs
          • Job 1:

            • Job state is ok

            Command Line:

            • GX_NUM_CORES=${GALAXY_SLOTS:-2} gx clean-genome --input '/tmp/tmpz8it333k/files/7/b/e/dataset_7be46a3b-fdb6-4a03-b3b9-a9c6032bb38d.dat' --action-report '/tmp/tmpz8it333k/files/8/a/d/dataset_8ad560ef-9c94-412a-871e-a7e982b06299.dat' --contam-fasta-out 'contam.fa' --min-seq-len '200' --output 'clean.fa'

            Exit Code:

            • 0

            Standard Error:

            • Applied 3 actions; 2 seqs dropped; 580 bps dropped; 0 bps lowercased; 58 bps hardmasked.
              

            Traceback:

            Job Parameters:

            • Job parameter Parameter value
              __input_ext "input"
              __workflow_invocation_uuid__ "5742fb26881a11f0ac697ced8d2915de"
              chromInfo "/tmp/tmpz8it333k/galaxy-dev/tool-data/shared/ucsc/chrom/?.len"
              dbkey "?"
              mode {"__current_case__": 1, "action_report": {"values": [{"id": 16, "src": "hda"}]}, "input": {"values": [{"id": 2, "src": "hda"}]}, "min_seq_len": "200", "mode_selector": "clean"}
    • Other invocation details
      • history_id

        • 42e717782d3ade90
      • history_state

        • error
      • invocation_id

        • 42e717782d3ade90
      • invocation_state

        • scheduled
      • workflow_id

        • fa7c723f2b9921c9
