Hi,
Thank you for the fantastic method for viral quantification and for sharing the various analysis notebooks.
I am trying to apply your method to a large set of human 10x scRNA-seq datasets (~50 billion reads) in order to detect and quantify viral sequences from palmDB. I am using option 7, i.e. capturing host reads before alignment to palmDB. During the alignment step, the host reads took around 50-60 hours to align, but the viral alignment to palmDB is taking much longer.
It has now been running for over 4 days and has processed close to 25% of the total 50B reads. Is there a way to speed up the alignment of the viral reads, or is it expected to take this long to align to palmDB given the large number of 10xv3 sequencing reads?
Here is the command I used for aligning the reads to palmDB:
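It follows the general `kb count --aa` translated-alignment pattern from the palmDB notebooks; the index, t2g, output paths, FASTQ filenames, and thread count below are placeholders rather than my exact values:

```bash
# Translated (amino acid) alignment of 10xv3 reads against a prebuilt palmDB index
# using kb-python. All paths/filenames are placeholders; -t sets the number of threads.
kb count \
    --aa \
    -i palmdb_index.idx \
    -g palmdb_t2g.txt \
    -x 10xv3 \
    -o palmdb_out \
    --strand unstranded \
    -t 16 \
    sample_R1.fastq.gz sample_R2.fastq.gz
```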
If there is any way to speed up this viral alignment, that would be really helpful. Thank you so much for your help!