Description of the bug

I am running into an issue with the generation of the MultiQC report in the bacass pipeline. This process runs into an "input file name collision" error and is unable to complete the report. Specifically, the error appears to occur when processing multiple samples with hybrid or long-read assemblies, but it does not occur when running a single hybrid or long-read assembly, or when processing single or multiple samples with short-read assemblies. I have attached the Nextflow log file, parameter file, and sample sheet from an example run that hit this error below.

Command used and terminal output

nextflow run 'https://github.com/nf-core/bacass' -name assembly_5706_2 -params-file 'https://api.cloud.seqera.io/ephemeral/2xKzdXhAmBlQi4GLwrdd9Q.json' -with-tower -r 2.3.1 -profile docker

Error executing process > 'NFCORE_BACASS:BACASS:MULTIQC_CUSTOM (1)'

Caused by:
  Process `NFCORE_BACASS:BACASS:MULTIQC_CUSTOM` input file name collision -- There are multiple input files for each of the following file names: nanoplot/NanoStats_post_filtering.txt
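From the error message, it looks like every sample's NanoPlot output has the same fixed name (nanoplot/NanoStats_post_filtering.txt), so when MULTIQC_CUSTOM stages files from several samples into one work directory the names clash. Below is a minimal, hypothetical Nextflow sketch that reproduces the same class of error; the process and channel names are made up for illustration and are not the pipeline's actual code.

```nextflow
nextflow.enable.dsl = 2

// Hypothetical stand-in for the per-sample NanoPlot step: every sample
// writes a file with the same fixed name.
process NANOSTATS {
    input:
    val sample_id

    output:
    path "NanoStats_post_filtering.txt"

    script:
    """
    echo "stats for ${sample_id}" > NanoStats_post_filtering.txt
    """
}

// Hypothetical stand-in for MULTIQC_CUSTOM: all collected files are staged
// into a single work directory, so identical file names collide.
process COLLECT_FOR_MULTIQC {
    input:
    path stats

    script:
    """
    ls -l
    """
}

workflow {
    samples = Channel.of('sample1', 'sample2')
    NANOSTATS(samples)

    // With two or more samples this fails at staging time with the same
    // kind of message:
    //   input file name collision -- There are multiple input files for
    //   each of the following file names: NanoStats_post_filtering.txt
    COLLECT_FOR_MULTIQC(NANOSTATS.out.collect())
}
```

In similar situations, renaming each per-sample output to include the sample ID (or staging each file under a distinct subdirectory) avoids the clash, but I am not sure whether that is the appropriate fix inside the pipeline itself.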
Relevant files
nf-3DxU8NpOkvS2iG.log
5706_samples.csv
assembly_5706_2-params.json
System information