Template update for nf-core/tools version 3.1.1 #15

Open · wants to merge 2 commits into base: dev
4 changes: 4 additions & 0 deletions .editorconfig
@@ -31,3 +31,7 @@ indent_size = unset
# ignore python and markdown
[*.{py,md}]
indent_style = unset

# ignore ro-crate metadata files
[**/ro-crate-metadata.json]
insert_final_newline = unset
1 change: 0 additions & 1 deletion .github/ISSUE_TEMPLATE/bug_report.yml
@@ -3,7 +3,6 @@ description: Report something that is broken or incorrect
labels:
- bug
body:

- type: textarea
id: description
attributes:
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -45,7 +45,7 @@ jobs:
profile: "singularity"
steps:
- name: Check out pipeline code
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4

- name: Set up Nextflow
uses: nf-core/setup-nextflow@v2
49 changes: 30 additions & 19 deletions .github/workflows/download_pipeline.yml
@@ -2,7 +2,7 @@ name: Test successful pipeline download with 'nf-core pipelines download'

# Run the workflow when:
# - dispatched manually
# - when a PR is opened or reopened to master branch
# - when a PR is opened or reopened to main/master branch
# - the head branch of the pull request is updated, i.e. if fixes for a release are pushed last minute to dev.
on:
workflow_dispatch:
@@ -17,25 +17,31 @@ on:
- edited
- synchronize
branches:
- main
- master
pull_request_target:
branches:
- main
- master

env:
NXF_ANSI_LOG: false

jobs:
download:
configure:
runs-on: ubuntu-latest
outputs:
REPO_LOWERCASE: ${{ steps.get_repo_properties.outputs.REPO_LOWERCASE }}
REPOTITLE_LOWERCASE: ${{ steps.get_repo_properties.outputs.REPOTITLE_LOWERCASE }}
REPO_BRANCH: ${{ steps.get_repo_properties.outputs.REPO_BRANCH }}
steps:
- name: Install Nextflow
uses: nf-core/setup-nextflow@v2

- name: Disk space cleanup
uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1

- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d # v5
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b # v5
with:
python-version: "3.12"
architecture: "x64"
@@ -51,65 +57,70 @@ jobs:
pip install git+https://github.com/nf-core/tools.git@dev

- name: Get the repository name and current branch set as environment variable
id: get_repo_properties
run: |
echo "REPO_LOWERCASE=${GITHUB_REPOSITORY,,}" >> ${GITHUB_ENV}
echo "REPOTITLE_LOWERCASE=$(basename ${GITHUB_REPOSITORY,,})" >> ${GITHUB_ENV}
echo "REPO_BRANCH=${{ github.event.inputs.testbranch || 'dev' }}" >> ${GITHUB_ENV}
echo "REPO_LOWERCASE=${GITHUB_REPOSITORY,,}" >> "$GITHUB_OUTPUT"
echo "REPOTITLE_LOWERCASE=$(basename ${GITHUB_REPOSITORY,,})" >> "$GITHUB_OUTPUT"
echo "REPO_BRANCH=${{ github.event.inputs.testbranch || 'dev' }}" >> "$GITHUB_OUTPUT"

- name: Make a cache directory for the container images
run: |
mkdir -p ./singularity_container_images

download:
runs-on: ubuntu-latest
needs: configure
steps:
- name: Download the pipeline
env:
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
run: |
nf-core pipelines download ${{ env.REPO_LOWERCASE }} \
--revision ${{ env.REPO_BRANCH }} \
--outdir ./${{ env.REPOTITLE_LOWERCASE }} \
nf-core pipelines download ${{ needs.configure.outputs.REPO_LOWERCASE }} \
--revision ${{ needs.configure.outputs.REPO_BRANCH }} \
--outdir ./${{ needs.configure.outputs.REPOTITLE_LOWERCASE }} \
--compress "none" \
--container-system 'singularity' \
--container-library "quay.io" -l "docker.io" -l "community.wave.seqera.io" \
--container-library "quay.io" -l "docker.io" -l "community.wave.seqera.io/library/" \
--container-cache-utilisation 'amend' \
--download-configuration 'yes'

- name: Inspect download
run: tree ./${{ env.REPOTITLE_LOWERCASE }}
run: tree ./${{ needs.configure.outputs.REPOTITLE_LOWERCASE }}

- name: Count the downloaded number of container images
id: count_initial
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Initial container image count: $image_count"
echo "IMAGE_COUNT_INITIAL=$image_count" >> ${GITHUB_ENV}
echo "IMAGE_COUNT_INITIAL=$image_count" >> "$GITHUB_OUTPUT"

- name: Run the downloaded pipeline (stub)
id: stub_run_pipeline
continue-on-error: true
env:
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -stub -profile test,singularity --outdir ./results
run: nextflow run ./${{needs.configure.outputs.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ needs.configure.outputs.REPO_BRANCH }}) -stub -profile test,singularity --outdir ./results
- name: Run the downloaded pipeline (stub run not supported)
id: run_pipeline
if: ${{ job.steps.stub_run_pipeline.status == failure() }}
if: ${{ steps.stub_run_pipeline.outcome == 'failure' }}
env:
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -profile test,singularity --outdir ./results
run: nextflow run ./${{ needs.configure.outputs.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ needs.configure.outputs.REPO_BRANCH }}) -profile test,singularity --outdir ./results

- name: Count the downloaded number of container images
id: count_afterwards
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Post-pipeline run container image count: $image_count"
echo "IMAGE_COUNT_AFTER=$image_count" >> ${GITHUB_ENV}
echo "IMAGE_COUNT_AFTER=$image_count" >> "$GITHUB_OUTPUT"

- name: Compare container image counts
run: |
if [ "${{ env.IMAGE_COUNT_INITIAL }}" -ne "${{ env.IMAGE_COUNT_AFTER }}" ]; then
initial_count=${{ env.IMAGE_COUNT_INITIAL }}
final_count=${{ env.IMAGE_COUNT_AFTER }}
if [ "${{ steps.count_initial.outputs.IMAGE_COUNT_INITIAL }}" -ne "${{ steps.count_afterwards.outputs.IMAGE_COUNT_AFTER }}" ]; then
initial_count=${{ steps.count_initial.outputs.IMAGE_COUNT_INITIAL }}
final_count=${{ steps.count_afterwards.outputs.IMAGE_COUNT_AFTER }}
difference=$((final_count - initial_count))
echo "$difference additional container images were \n downloaded at runtime . The pipeline has no support for offline runs!"
tree ./singularity_container_images
4 changes: 2 additions & 2 deletions .github/workflows/fix-linting.yml
@@ -13,7 +13,7 @@ jobs:
runs-on: ubuntu-latest
steps:
# Use the @nf-core-bot token to check out so we can push later
- uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
token: ${{ secrets.nf_core_bot_auth_token }}

@@ -32,7 +32,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.nf_core_bot_auth_token }}

# Install and run pre-commit
- uses: actions/setup-python@82c7e631bb3cdc910f68e0081d67478d79c6982d # v5
- uses: actions/setup-python@0b93645e9fea7318ecaed2b359559ac225c90a2b # v5
with:
python-version: "3.12"

2 changes: 1 addition & 1 deletion .github/workflows/template_version_comment.yml
@@ -9,7 +9,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out pipeline code
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
ref: ${{ github.event.pull_request.head.sha }}

11 changes: 2 additions & 9 deletions .gitpod.yml
@@ -6,12 +6,5 @@ tasks:
nextflow self-update

vscode:
extensions: # based on nf-core.nf-core-extensionpack
#- esbenp.prettier-vscode # Markdown/CommonMark linting and style checking for Visual Studio Code
- EditorConfig.EditorConfig # override user/workspace settings with settings found in .editorconfig files
- Gruntfuggly.todo-tree # Display TODO and FIXME in a tree view in the activity bar
- mechatroner.rainbow-csv # Highlight columns in csv files in different colors
- nextflow.nextflow # Nextflow syntax highlighting
- oderwat.indent-rainbow # Highlight indentation level
- streetsidesoftware.code-spell-checker # Spelling checker for source code
- charliermarsh.ruff # Code linter Ruff
extensions:
- nf-core.nf-core-extensionpack # https://github.com/nf-core/vscode-extensionpack
22 changes: 9 additions & 13 deletions .nf-core.yml
@@ -1,4 +1,3 @@
bump_version: null
lint:
files_exist:
- CODE_OF_CONDUCT.md
@@ -25,25 +24,22 @@ lint:
- .github/CONTRIBUTING.md
- .github/PULL_REQUEST_TEMPLATE.md
multiqc_config:
- report_comment
- report_comment
nextflow_config:
- manifest.name
- manifest.homePage
- validation.help.beforeText
- validation.help.afterText
- validation.summary.beforeText
- validation.summary.afterText
- manifest.name
- manifest.homePage
- validation.help.beforeText
- validation.help.afterText
- validation.summary.beforeText
- validation.summary.afterText
nf_core_version: 3.1.1
org_path: null
repository_type: pipeline
template:
author: Muhammad Muneeb Nasir
description: Lightning fast & automated metagenomic pipeline powered by Nextflow
force: true
force: false
is_nfcore: false
name: metabolt
org: MDL
outdir: .
skip_features: null
version: '1.0'
update: null
version: "1.0"
3 changes: 3 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,3 @@
{
"markdown.styles": ["public/vscode_markdown.css"]
}
2 changes: 1 addition & 1 deletion LICENSE
@@ -1,6 +1,6 @@
MIT License

Copyright (c) Muneeb
Copyright (c) The MDL/metabolt team

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
2 changes: 1 addition & 1 deletion README.md
@@ -51,7 +51,7 @@
## Usage

> [!NOTE]
> If you are new to Nextflow and nf-core, please refer to [this page](https://nf-co.re/docs/usage/installation) on how to set-up Nextflow. Make sure to [test your setup](https://nf-co.re/docs/usage/introduction#how-to-run-a-pipeline) with `-profile test` before running the workflow on actual data.
> If you are new to Nextflow and nf-core, please refer to [this page](https://nf-co.re/docs/usage/installation) on how to set-up Nextflow. Make sure to [test your setup](https://nf-co.re/docs/usage/introduction#how-to-run-a-pipeline) with `-profile test` before running the workflow on actual data.

### Minimum Steps to Execute the Pipeline

2 changes: 1 addition & 1 deletion conf/base.config
@@ -20,7 +20,7 @@ process {
maxErrors = '-1'

// Process-specific resource requirements
// NOTE - Please try and re-use the labels below as much as possible.
// NOTE - Please try and reuse the labels below as much as possible.
// These labels are used and recognised by default in DSL2 files hosted on nf-core/modules.
// If possible, it would be nice to keep the same label naming convention when
// adding in your local modules too.
32 changes: 14 additions & 18 deletions docs/usage.md
@@ -73,9 +73,8 @@ If you wish to repeatedly use the same parameters for multiple runs, rather than

Pipeline settings can be provided in a `yaml` or `json` file via `-params-file <file>`.

:::warning
Do not use `-c <file>` to specify parameters as this will result in errors. Custom config files specified with `-c` must only be used for [tuning process resource specifications](https://nf-co.re/docs/usage/configuration#tuning-workflow-resources), other infrastructural tweaks (such as output directories), or module arguments (args).
:::
> [!WARNING]
> Do not use `-c <file>` to specify parameters as this will result in errors. Custom config files specified with `-c` must only be used for [tuning process resource specifications](https://nf-co.re/docs/usage/configuration#tuning-workflow-resources), other infrastructural tweaks (such as output directories), or module arguments (args).

The above pipeline run specified with a params file in yaml format:

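A minimal sketch of what such a params file could contain (the field names below are placeholders and are not taken from this pipeline's schema):

```yaml
# params.yaml (hypothetical); replace the values with your own inputs
input: './samplesheet.csv'
outdir: './results'
```

It would then be supplied with `nextflow run MDL/metabolt -profile docker -params-file params.yaml`.
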
@@ -104,40 +103,37 @@ nextflow pull MDL/metabolt

### Reproducibility

It is a good idea to specify a pipeline version when running the pipeline on your data. This ensures that a specific version of the pipeline code and software are used when you run your pipeline. If you keep using the same tag, you'll be running the same version of the pipeline, even if there have been changes to the code since.
It is a good idea to specify the pipeline version when running the pipeline on your data. This ensures that a specific version of the pipeline code and software are used when you run your pipeline. If you keep using the same tag, you'll be running the same version of the pipeline, even if there have been changes to the code since.

First, go to the [MDL/metabolt releases page](https://github.com/MDL/metabolt/releases) and find the latest pipeline version - numeric only (eg. `1.3.1`). Then specify this when running the pipeline with `-r` (one hyphen) - eg. `-r 1.3.1`. Of course, you can switch to another version by changing the number after the `-r` flag.

This version number will be logged in reports when you run the pipeline, so that you'll know what you used when you look back in the future. For example, at the bottom of the MultiQC reports.

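As a sketch, assuming `1.3.1` is a tag listed on the releases page, a pinned run could look like this (input parameters omitted for brevity):

```bash
# Pin the pipeline revision so the same code and software versions are used on every run
nextflow run MDL/metabolt -r 1.3.1 -profile docker --outdir ./results
```
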
To further assist in reproducbility, you can use share and re-use [parameter files](#running-the-pipeline) to repeat pipeline runs with the same settings without having to write out a command with every single parameter.
To further assist in reproducibility, you can use share and reuse [parameter files](#running-the-pipeline) to repeat pipeline runs with the same settings without having to write out a command with every single parameter.

:::tip
If you wish to share such profile (such as upload as supplementary material for academic publications), make sure to NOT include cluster specific paths to files, nor institutional specific profiles.
:::
> [!TIP]
> If you wish to share such profile (such as upload as supplementary material for academic publications), make sure to NOT include cluster specific paths to files, nor institutional specific profiles.

## Core Nextflow arguments

:::note
These options are part of Nextflow and use a _single_ hyphen (pipeline parameters use a double-hyphen).
:::
> [!NOTE]
> These options are part of Nextflow and use a _single_ hyphen (pipeline parameters use a double-hyphen)

### `-profile`

Use this parameter to choose a configuration profile. Profiles can give configuration presets for different compute environments.

Several generic profiles are bundled with the pipeline which instruct the pipeline to use software packaged using different methods (Docker, Singularity, Podman, Shifter, Charliecloud, Apptainer, Conda) - see below.

:::info
We highly recommend the use of Docker or Singularity containers for full pipeline reproducibility, however when this is not possible, Conda is also supported.
:::
> [!IMPORTANT]
> We highly recommend the use of Docker or Singularity containers for full pipeline reproducibility, however when this is not possible, Conda is also supported.

The pipeline also dynamically loads configurations from [https://github.com/nf-core/configs](https://github.com/nf-core/configs) when it runs, making multiple config profiles for various institutional clusters available at run time. For more information and to see if your system is available in these configs please see the [nf-core/configs documentation](https://github.com/nf-core/configs#documentation).
The pipeline also dynamically loads configurations from [https://github.com/nf-core/configs](https://github.com/nf-core/configs) when it runs, making multiple config profiles for various institutional clusters available at run time. For more information and to check if your system is supported, please see the [nf-core/configs documentation](https://github.com/nf-core/configs#documentation).

Note that multiple profiles can be loaded, for example: `-profile test,docker` - the order of arguments is important!
They are loaded in sequence, so later profiles can overwrite earlier profiles.

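A minimal sketch of that ordering, using the profiles named above:

```bash
# 'test' is applied first, then 'docker'; any setting defined in both is taken from 'docker'
nextflow run MDL/metabolt -profile test,docker --outdir ./results
```
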
If `-profile` is not specified, the pipeline will run locally and expect all software to be installed and available on the `PATH`. This is _not_ recommended, since it can lead to different results on different machines dependent on the computer enviroment.
If `-profile` is not specified, the pipeline will run locally and expect all software to be installed and available on the `PATH`. This is _not_ recommended, since it can lead to different results on different machines dependent on the computer environment.

- `test`
- A profile with a complete configuration for automated testing
@@ -173,13 +169,13 @@ Specify the path to a specific config file (this is a core Nextflow command). Se

### Resource requests

Whilst the default requirements set within the pipeline will hopefully work for most people and with most input data, you may find that you want to customise the compute resources that the pipeline requests. Each step in the pipeline has a default set of requirements for number of CPUs, memory and time. For most of the steps in the pipeline, if the job exits with any of the error codes specified [here](https://github.com/nf-core/rnaseq/blob/4c27ef5610c87db00c3c5a3eed10b1d161abf575/conf/base.config#L18) it will automatically be resubmitted with higher requests (2 x original, then 3 x original). If it still fails after the third attempt then the pipeline execution is stopped.
Whilst the default requirements set within the pipeline will hopefully work for most people and with most input data, you may find that you want to customise the compute resources that the pipeline requests. Each step in the pipeline has a default set of requirements for number of CPUs, memory and time. For most of the pipeline steps, if the job exits with any of the error codes specified [here](https://github.com/nf-core/rnaseq/blob/4c27ef5610c87db00c3c5a3eed10b1d161abf575/conf/base.config#L18) it will automatically be resubmitted with higher resources request (2 x original, then 3 x original). If it still fails after the third attempt then the pipeline execution is stopped.

To change the resource requests, please see the [max resources](https://nf-co.re/docs/usage/configuration#max-resources) and [tuning workflow resources](https://nf-co.re/docs/usage/configuration#tuning-workflow-resources) section of the nf-core website.

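As an illustration, a custom config supplied with `-c` might raise the resources for a single process. The process selector and values below are assumptions, not taken from this pipeline's configuration:

```groovy
// custom_resources.config (hypothetical); point withName at a real process in the pipeline
process {
    withName: 'FASTP' {
        cpus   = 8
        memory = 32.GB
        time   = 12.h
    }
}
```

It would be passed as `nextflow run MDL/metabolt -profile docker -c custom_resources.config ...`.
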
### Custom Containers

In some cases you may wish to change which container or conda environment a step of the pipeline uses for a particular tool. By default nf-core pipelines use containers and software from the [biocontainers](https://biocontainers.pro/) or [bioconda](https://bioconda.github.io/) projects. However in some cases the pipeline specified version maybe out of date.
In some cases, you may wish to change the container or conda environment used by a pipeline steps for a particular tool. By default, nf-core pipelines use containers and software from the [biocontainers](https://biocontainers.pro/) or [bioconda](https://bioconda.github.io/) projects. However, in some cases the pipeline specified version maybe out of date.

To use a different container from the default container or conda environment specified in a pipeline, please see the [updating tool versions](https://nf-co.re/docs/usage/configuration#updating-tool-versions) section of the nf-core website.

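A hedged sketch of such an override in a custom config; the process name and container tag are illustrative and should be checked against the pipeline's modules:

```groovy
// custom_container.config (hypothetical); pin whichever validated tool version you need
process {
    withName: 'FASTQC' {
        container = 'quay.io/biocontainers/fastqc:0.12.1--hdfd78af_0'
    }
}
```
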
2 changes: 1 addition & 1 deletion modules.json
@@ -17,7 +17,7 @@
},
"fastp": {
"branch": "master",
"git_sha": "666652151335353eef2fcd58880bcef5bc2928e1",
"git_sha": "dc94b6ee04a05ddb9f7ae050712ff30a13149164",
"installed_by": ["modules"]
},
"fastqc": {