Implement download resumption in `humann_databases`

I tried to follow Step 5 of “Initial Installation” and ran into an issue with the command `humann_databases --download chocophlan full <path>`. After ~15 minutes of downloading, the command failed with the following error: `CRITICAL ERROR: Unable to download and extract from URL: http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz`. I tried this command 4 times, with each attempt ending in the same error. It appears other users have had similar issues downloading from the default file-hosting URLs (see the thread “Difficulty downloading databases in humann3”). Considering the size of the dataset, and thus the time it takes to download, restarting from scratch after every failure is inefficient.

If possible, it would be good to add resumption functionality similar to `wget -c`, so that the download utility can pick up where an interrupted transfer left off.
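For illustration, here is a minimal sketch of what such resumption could look like using an HTTP `Range` request (the function name and structure are hypothetical, not part of `humann_databases`): if a partial file is already on disk, the client asks the server to resume from its current size; a `206 Partial Content` response means the server honored the request and we append, while a plain `200` means it did not and the download must restart from scratch.

```python
import os
import urllib.request


def resume_download(url, dest, chunk_size=1024 * 1024):
    """Download `url` to `dest`, resuming from a partial file if one exists.

    Hypothetical helper, not part of humann; sketches resumption via the
    HTTP Range header, similar in spirit to `wget -c`.
    """
    # Number of bytes already downloaded (0 if starting fresh).
    start = os.path.getsize(dest) if os.path.exists(dest) else 0

    request = urllib.request.Request(url)
    if start:
        # Ask the server to skip the bytes we already hold.
        request.add_header("Range", f"bytes={start}-")

    with urllib.request.urlopen(request) as response:
        # 206 => server honored the Range header, so append to the partial
        # file; 200 => server sent the whole file, so overwrite and restart.
        mode = "ab" if response.status == 206 else "wb"
        with open(dest, mode) as out:
            while True:
                chunk = response.read(chunk_size)
                if not chunk:
                    break
                out.write(chunk)
```

A real implementation would also want to verify the final file size (or a checksum) against the server's `Content-Length`, and retry with backoff when the connection drops mid-transfer.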

Hello,
Did you manage to download the databases? I am struggling to do so now.

I managed to download the ChocoPhlAn database at the time (I used `wget -c` as stated in my OP, mainly because my internet connection was unstable then). Now, however, all of the database download URLs are returning 403 errors, which suggests a server-side problem. I am not sure if there are workarounds.

@Subhajeet_Dutta at the time of writing, the 403 error is gone on my end; I now seem to be able to download the DBs.