Fastp process exceeding running time limit #23
Comments
You could try a custom config file with:

```
process {
    time = 240.h
}
```

to set the default time limit for all processes to 10 days, for example, and run the pipeline with that config file.
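A minimal sketch of the run command, assuming the snippet above is saved as `time.config` (a hypothetical file name; any path works):

```
# -c adds the custom config file to the run's configuration
nextflow run <pipeline> -c time.config
```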
Thanks, I'll try that! If I already have `-c` set with the villumina-high-mem-centrifuge.config file, can I just add that `process` block for the time to that file so that it will do both?
Yes, and I believe you can also specify multiple configs with the same pipeline run.
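For instance (a sketch; `time.config` is the hypothetical file from above, and if I recall correctly Nextflow merges repeated `-c` options, with later files taking precedence where settings overlap):

```
nextflow run <pipeline> -c villumina-high-mem-centrifuge.config -c time.config
```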
I have been trying to run the pipeline on a very large dataset (file sizes >25 GB each), and I'm running into an error on the fastp step:

```
Process exceeds running time limit (1h)
```

The pipeline summary says the default max time for the workflow is set to 10d, and I've also tried different values with the --max_time parameter, but the step always fails after 1 hour no matter what the max time is set to. Is there another parameter I should be using to adjust this, or another way to increase the time? Fastp and the subsequent Kraken2 and Centrifuge processes work on a sample that is ~25 GB, but the other samples are much larger (>45 GB), and it's on those that the issue appears.
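If a global bump is too blunt, a narrower sketch would raise the limit only for the fastp step; the selector name `FASTP` is an assumption here, not taken from the pipeline source, so check the actual process name first:

```
// custom.config (hypothetical file name) -- raise the time limit
// for the fastp process only, leaving the other defaults alone
process {
    withName: FASTP {   // assumed process name; verify in the pipeline
        time = 240.h
    }
}
```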