HUMAnN database download

Hi,
I thought it might be best to open a new issue (related: Latest database download issue - #5 by gjordaopiedade).

Since yesterday I have been trying to download the HUMAnN databases, but I always get a critical error. I was wondering whether this is a known issue or whether I am potentially doing something wrong.

humann_databases --download chocophlan full /projects/0/gusr0506/goncalo/databases/HUMAnN --update-config yes
Creating subdirectory to install database: /projects/0/gusr0506/goncalo/databases/HUMAnN/chocophlan
Download URL: http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz
CRITICAL ERROR: Unable to download and extract from URL: http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz

I also noticed that when I visit the Huttenhower Lab page I get an invalid certificate warning:

Your connection is not private

Attackers might be trying to steal your information from huttenhower.sph.harvard.edu (for example, passwords, messages, or credit cards). Learn more

NET::ERR_CERT_DATE_INVALID

Thanks in advance!
Best,
Gonçalo


I guess your server is back online.
I still couldn’t use humann_databases --download, but wget now works.

For those out there also struggling to download the databases, here is what I did:

mkdir chocophlan
cd chocophlan
wget --no-check-certificate http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz 
tar -xf full_chocophlan.v201901_v31.tar.gz

humann_config --update database_folders nucleotide /FULL/PATH/chocophlan

And, same for uniref90:

mkdir uniref
cd uniref
wget --no-check-certificate https://huttenhower.sph.harvard.edu/humann_data/uniprot/uniref_annotated/uniref90_annotated_v201901b_full.tar.gz
tar -xf uniref90_annotated_v201901b_full.tar.gz

humann_config --update database_folders protein /FULL/PATH/uniref
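One more step that can save a confusing failure later: a truncated download is a common reason `tar -xf` fails (and may be behind the CRITICAL ERROR above), so it is worth listing the archive's contents before extracting. A minimal sketch, using a throwaway stand-in archive so the check is visible here; with the real database you would run the same `tar -tzf` on the `.tar.gz` you fetched:

```shell
# Build a tiny stand-in archive (hypothetical; stands in for the real
# database tarball in this demonstration)
mkdir -p demo && echo "data" > demo/file.txt
tar -czf demo.tar.gz demo

# List the archive's contents without extracting; a truncated or corrupt
# .tar.gz makes this exit nonzero instead of silently extracting junk
if tar -tzf demo.tar.gz > /dev/null 2>&1; then
  echo "archive OK"
else
  echo "archive corrupt or truncated"
fi
```

If the check fails, re-download the file before trying to extract it again.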

Thanks for your comments here. We did indeed have some website issues that are now resolved.

I am still getting the error when I try to download: Unable to download and extract from URL: http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz
The issues do not seem to be resolved!

Had this issue today with:
humann_databases --download chocophlan full $DATABASE_DIR

wget and tar work to get the database.

You can download it manually and extract it into whatever folder you want (I got some security warnings when downloading manually, and you have to grant permission).

Following up on this issue. We are experiencing the following behaviors when trying to download the HUMAnN 3 databases.

The HUMAnN 3 download method times out when passing the following command:

humann_databases --download chocophlan full <PATH> --update-config yes 

wget returns "Unable to establish SSL connection" errors:

wget --no-check-certificate http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz 

We get a similar timeout with curl (we thought the issue could be with following redirects). We checked that wget and curl worked with a number of other downloads, and they did. We also confirmed that we could download the database on a local MacBook with:

wget --no-check-certificate http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz 

It works on the MacBook, so I thought it could be a wget version-specific bug. I updated wget on the server (Red Hat) to 1.21.3, but there was no change in behavior. As others have pointed out, I can push the local files up to the server, but this is not very practical for the larger databases. Any ideas on a more permanent solution?
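Since the browser error earlier in the thread was NET::ERR_CERT_DATE_INVALID, one thing worth checking from the machine where wget fails is the certificate's validity window. A hedged sketch, assuming `openssl` is installed: the commented-out lines show the check against the live site, and the rest demonstrates the same date-printing step on a throwaway self-signed certificate:

```shell
# Against the live site (network required), the equivalent check would be:
#   echo | openssl s_client -connect huttenhower.sph.harvard.edu:443 2>/dev/null \
#     | openssl x509 -noout -dates

# Demonstration on a throwaway self-signed certificate (hypothetical paths):
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo_key.pem \
  -out /tmp/demo_cert.pem -days 1 -subj "/CN=demo" 2>/dev/null

# Print the notBefore/notAfter dates -- an expired notAfter is exactly
# what NET::ERR_CERT_DATE_INVALID complains about
openssl x509 -in /tmp/demo_cert.pem -noout -dates
```

If the dates look fine from a machine that works but not from the server, the server may be negotiating a different (e.g. older) TLS version or hitting a proxy, which would also explain the SSL connection errors.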