Memory issue with normalisation in HUMAnN 3

After merging the per-sample HUMAnN 3 outputs into a single merged_genefamilies.tsv file, I tried to run the normalisation step with humann_renorm_table. The file contains 1004 samples and is about 25 GB; even with 256 GB of memory allocated, the job still ran out of memory. Does anyone have suggestions for dealing with this? Thanks!
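The command was along these lines (exact flags may have differed; `--units cpm` stands in for whichever units I requested):

```bash
# Normalisation of the merged 25 GB table -- this is the step that runs out of memory
humann_renorm_table \
    --input merged_genefamilies.tsv \
    --output merged_genefamilies_cpm.tsv \
    --units cpm
```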

I generally recommend against merging all of the UniRef profiles into a single table, since the resulting file is so large and sparse. Instead, you can normalize the UniRef profiles separately, regroup them into broader categories, and then merge those (much smaller) tables for statistical analysis.
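A minimal sketch of that per-sample workflow, assuming the HUMAnN utility scripts are on your PATH and your per-sample outputs sit in a flat `humann_output/` directory (the directory name and the `uniref90_rxn` grouping are just illustrative; other groupings require the utility-mapping databases):

```bash
# 1. Normalize each per-sample gene family table to copies per million (CPM)
for f in humann_output/*_genefamilies.tsv; do
    humann_renorm_table \
        --input "$f" \
        --output "${f%.tsv}_cpm.tsv" \
        --units cpm
done

# 2. Regroup each normalized table into broader categories
#    (uniref90_rxn = MetaCyc reactions, the mapping shipped with HUMAnN)
for f in humann_output/*_genefamilies_cpm.tsv; do
    humann_regroup_table \
        --input "$f" \
        --output "${f%.tsv}_rxn.tsv" \
        --groups uniref90_rxn
done

# 3. Merge the small regrouped tables into one file for statistical analysis
humann_join_tables \
    --input humann_output \
    --output merged_rxn_cpm.tsv \
    --file_name genefamilies_cpm_rxn
```

Because each per-sample table is processed on its own, peak memory is bounded by the largest single profile rather than by the 25 GB merged table.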

If you do want to work at the UniRef level, you can make versions of the per-sample files with the strata removed and merge just the community totals, which should substantially reduce the file size.
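One way to drop the strata is `humann_split_stratified_table`, which writes separate stratified and unstratified versions of each table into an output directory (file and directory names here are illustrative):

```bash
# Split each per-sample table; the *_unstratified.tsv outputs
# contain only the community totals, with per-taxon strata removed
mkdir -p humann_unstrat
for f in humann_output/*_genefamilies.tsv; do
    humann_split_stratified_table \
        --input "$f" \
        --output humann_unstrat
done

# Merge just the community-total tables -- far smaller than the stratified merge
humann_join_tables \
    --input humann_unstrat \
    --output merged_genefamilies_unstratified.tsv \
    --file_name genefamilies_unstratified
```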

Thanks for the advice!