Releases: magicDGS/ReadTools
Version 1.5.2
Bug fix for advanced users setting the Java properties for barcode separators (both sequences and qualities). Using a special regex character as separator did not work before: instead of splitting on it, the tool introduced it literally into the barcode separator of the output file. This fix corrects the issue and allows other separators to be properly used as defaults (e.g., "+") instead of the SAM-spec recommended hyphen ("-").
We recommend updating if you are using the -Dreadtools.barcode_index_delimiter and/or -Dreadtools.barcode_quality_delimiter Java properties. Otherwise, the fix does not affect you.
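The underlying pitfall is general Java behavior rather than anything ReadTools-specific: characters such as "+" are regex metacharacters, so using them unquoted as a split pattern fails. A minimal illustration (the class and strings below are illustrative, not ReadTools code):

```java
import java.util.Arrays;
import java.util.regex.Pattern;
import java.util.regex.PatternSyntaxException;

public class SeparatorDemo {
    public static void main(String[] args) {
        String barcodes = "ACGT+TTGA";

        // Splitting on a raw "+" fails: "+" is a regex quantifier.
        try {
            barcodes.split("+");
        } catch (PatternSyntaxException e) {
            System.out.println("raw '+' is an invalid regex: " + e.getDescription());
        }

        // Quoting the separator makes it match literally.
        String[] parts = barcodes.split(Pattern.quote("+"));
        System.out.println(Arrays.toString(parts)); // prints [ACGT, TTGA]
    }
}
```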
Version 1.5.1
Bug fix for Casava-formatted FASTQ files with dual indexes. If you have this kind of data, we recommend updating.
Other miscellaneous fixes can be found in the CHANGELOG, mostly related to better error messages and documentation.
Version 1.5.0
Minor release with the following important changes:
- Trimming should have a performance boost
- Support non-local reference files for CRAM (e.g., HDFS or GCS)
See the CHANGELOG for more details.
Version 1.4.1
Bug fix related to using non-local files (e.g., GCS or HDFS). Specifically, input FASTQ files on non-local filesystems would not run with previous versions.
In addition, downloading reads from a Distmap job that does not have the _SUCCESS file now throws an error, because the missing file indicates that mapping failed.
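The _SUCCESS marker is the standard file Hadoop jobs write into their output directory on completion, so its absence is a cheap failure signal. A rough sketch of the kind of check described, assuming a local output directory (the class name and error message are illustrative, not the actual ReadTools implementation):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SuccessMarkerCheck {

    /** Throws if the job output directory lacks the Hadoop _SUCCESS marker. */
    static void requireSuccessMarker(Path jobOutputDir) throws IOException {
        Path marker = jobOutputDir.resolve("_SUCCESS");
        if (!Files.exists(marker)) {
            throw new IOException("Missing _SUCCESS file in " + jobOutputDir
                    + ": the mapping job likely failed");
        }
    }

    public static void main(String[] args) throws IOException {
        // Check the directory given on the command line (default: current dir).
        Path dir = Paths.get(args.length > 0 ? args[0] : ".");
        requireSuccessMarker(dir);
        System.out.println("Job output looks complete: " + dir);
    }
}
```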
Version 1.4.0
Minor release with the following important changes:
- BAM support for long reference sequences and long CIGARs (recommended if you are using ReadTools with long-read sequencing)
- Improved GKL compression (recommended if you use Intel processors)
See the CHANGELOG for more details.
IMPORTANT NOTE: short arguments for read-filters are no longer available. This is a minor breaking change; you should update your scripts for TrimReads if you were using the short versions.
Version 1.3.0
Minor release with new features and bug fixes:
- QualityEncodingDetector validation stringency is SILENT by default and can be provided as an argument
- Hadoop compression can be customized by providing codec extensions in the classpath
- The complete jar file can now be uncompressed on OSX
See the CHANGELOG for more details and changes.
Version 1.2.1
Bug fix related to Distmap, not affecting other parts of the code.
Version 1.2.0
Minor release with some new features:
- ReadsToDistmap supports trimming/filtering pipelines. This is useful for uploading to an HDFS cluster and performing mapping with Distmap, without trimming locally or on-cluster.
- LibraryReadFilter supports several libraries (through the GATK dependency)
- Support for input reference files on different file systems, such as HDFS/GCS (important for CRAM)
See the CHANGELOG for more details.
IMPORTANT NOTE: if you were using the command-line file expansion feature for collection arguments, you should change your input files from the .list extension to .args. This will change in the future (supporting both extensions).
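File expansion replaces a collection-argument value that names a file with that file's contents, one value per line. A rough sketch of extension-based expansion under that assumption (the class and method names are hypothetical, not the actual ReadTools/Barclay implementation):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class ArgsFileExpansion {

    /**
     * Expands a collection-argument value: a value ending in .args is
     * replaced by the lines of that file; any other value is kept as-is.
     */
    static List<String> expand(String value) throws IOException {
        if (value.endsWith(".args")) {
            return Files.readAllLines(Path.of(value));
        }
        return List.of(value);
    }

    public static void main(String[] args) throws IOException {
        // Build a throwaway .args file listing two inputs, then expand it.
        Path argsFile = Files.createTempFile("inputs", ".args");
        Files.write(argsFile, List.of("sample1.fastq", "sample2.fastq"));
        System.out.println(expand(argsFile.toString()));
        System.out.println(expand("single.fastq"));
    }
}
```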
Version 1.1.0
Minor release with some minor bug fixes and more consistent sort-order assumptions, mostly for FASTQ input data. Important bug fixes include:
- FASTQ read names with spaces and without barcode/pair-end information are now correctly handled.
- Sort order for pair-end data is assumed to be unsorted to be safe, but grouped by query (GO:query)
Thanks to new releases of third-party libraries, this release also includes other fixes important for the end user. For example, the GATK 4 beta release used in ReadTools adds better support for Google Cloud Storage (GCS). In addition, some problems coming from the HTSJDK library were solved:
- Major inconsistencies with CRAM formatted files are fixed
- Snappy compression/decompression is widely supported by a new version of the dependency
Version 1.0.0
First ReadTools release.
See CHANGELOG for differences with respect to pre-released versions.