Custom Database Creation Consumes All Disk Space

I ran humann with --bypass-prescreen and --bypass-translated-search because I already have MetaPhlAn taxonomic profiles from a previous MetaPhlAn run.

Output files will be written to: /tmp
Decompressing gzipped file ...
Creating custom ChocoPhlAn database ........
gzip: stdout: No space left on device

$ du -sh /tmp/*
70G     /tmp/OSCC_1-P_unmapped_R1_humann_temp

The custom ChocoPhlAn database seems remarkably large, and it would have grown even larger had the disk not filled up. Am I using the right options here? Could a tutorial be written specifically for running MetaPhlAn first and then HUMAnN?

Sorry for the confusion: if you already have a taxonomic profile you want to use, you can pass it to HUMAnN with the --taxonomic-profile flag. The --bypass-prescreen flag is shorthand for "index and profile against the entire ChocoPhlAn database," which is not recommended unless you have a very specific reason for doing so (in part because of the disk space it consumes, as you saw).
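For example, an invocation along these lines should work (a sketch; the input read file, profile file, and output directory names are placeholders to replace with your own paths, and the MetaPhlAn profile is the .tsv output from your earlier MetaPhlAn run):

```shell
# Reuse an existing MetaPhlAn profile instead of re-running the prescreen.
# Placeholder file names; substitute your own.
humann --input sample_R1.fastq.gz \
       --taxonomic-profile sample_metaphlan_profile.tsv \
       --output sample_humann_out
```

With --taxonomic-profile, the custom ChocoPhlAn database is built only from the pangenomes of the species detected in that profile, so the temporary files stay far smaller than the 70G you hit when indexing the entire database.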