Hi BURST devs,
I'm currently trying to build a BURST database from GTDB (FASTA size >100 GB) for a shotgun metagenomics pipeline, and I'm running into memory issues. I've read in other posts here that the --db option can be used to reduce the memory requirements, but the BURST help screen says this can be "lossy", and I'm not sure exactly what is being lost. Could you clarify? Whatever is being lost, would I be "losing" more by splitting the database into more partitions?
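For context, this is roughly how I was planning to split the GTDB FASTA into partitions before building a separate database from each chunk. It's just a rough sketch using Biopython; the file names and partition count are placeholders, not anything BURST-specific.

```python
from Bio import SeqIO

def split_fasta(in_path, n_parts, prefix="gtdb_part"):
    """Split a large FASTA into n_parts chunks, roughly balanced by sequence count."""
    records = SeqIO.parse(in_path, "fasta")
    handles = [open(f"{prefix}_{i}.fna", "w") for i in range(n_parts)]
    try:
        # Round-robin assignment keeps the partitions roughly the same size
        for idx, rec in enumerate(records):
            SeqIO.write(rec, handles[idx % n_parts], "fasta")
    finally:
        for h in handles:
            h.close()

# e.g. split the concatenated GTDB genomes into 4 chunks
split_fasta("gtdb_genomes.fna", n_parts=4)
```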
Thanks so much!