Since yesterday I have been trying to download the HUMAnN databases, but I always get a critical error. I was wondering if this is a known issue or if I am potentially doing something wrong.
humann_databases --download chocophlan full /projects/0/gusr0506/goncalo/databases/HUMAnN --update-config yes
Creating subdirectory to install database: /projects/0/gusr0506/goncalo/databases/HUMAnN/chocophlan
Download URL: http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz
CRITICAL ERROR: Unable to download and extract from URL: http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz
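For what it's worth, a quick way to check whether the URL is reachable from the server at all (just a diagnostic sketch, assuming curl is installed; -I fetches only the response headers and -L follows any redirect):

curl -sSIL http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz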
I also noticed that when I visit the Huttenhower Lab page I get an invalid certificate warning:
Your connection is not private
Attackers might be trying to steal your information from huttenhower.sph.harvard.edu (for example, passwords, messages, or credit cards). Learn more
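To rule out a problem on my end, I also inspected the certificate directly (a sketch using a standard openssl one-liner; it prints the certificate's issuer and validity dates):

echo | openssl s_client -connect huttenhower.sph.harvard.edu:443 2>/dev/null | openssl x509 -noout -issuer -dates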
I guess your server is back online.
I still couldn’t use humann_databases --download, but wget now works.
For those out there also struggling to download the database, I did this:
mkdir chocophlan
cd chocophlan
wget --no-check-certificate http://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz
tar -xf full_chocophlan.v201901_v31.tar.gz
humann_config --update database_folders nucleotide /FULL/PATH/chocophlan
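The /FULL/PATH placeholder above means the absolute path to the folder. If you are still inside the chocophlan directory, something like this saves typing it out (same effect as the line above):

humann_config --update database_folders nucleotide "$(pwd)"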
And, same for uniref90:
mkdir uniref
cd uniref
wget --no-check-certificate https://huttenhower.sph.harvard.edu/humann_data/uniprot/uniref_annotated/uniref90_annotated_v201901b_full.tar.gz
tar -xf uniref90_annotated_v201901b_full.tar.gz
humann_config --update database_folders protein /FULL/PATH/uniref
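Afterwards you can confirm that both folders were registered; humann_config --print lists the current settings:

humann_config --print

The database_folders section should show your chocophlan and uniref paths.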
You can also download it manually and extract it into whatever folder you want (I ran into some security warnings when downloading manually and had to explicitly allow the download).
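If wget is not available, curl should do the same thing (a sketch: -k skips certificate verification like --no-check-certificate, -L follows redirects, -O keeps the remote file name):

curl -kLO https://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz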
We get a similar timeout with curl (we thought the issue could be with following redirects). We checked whether wget and curl worked with a number of different downloads, and they did. We also checked that we could download the database on a local MacBook.
It works on the MacBook, so I thought it could be a wget version-specific bug. I updated wget on the server (Red Hat) to 1.21.3, but there was no change in behavior. As others have pointed out, I can push the local files up to the server, but this is not very practical for the larger databases. Any ideas on a more permanent solution?
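Until then, the most robust server-side download I have found is to let wget resume and retry on its own (a sketch; -c resumes a partial file, --tries and --timeout ride out transient timeouts):

wget -c --tries=20 --timeout=60 --no-check-certificate https://huttenhower.sph.harvard.edu/humann_data/chocophlan/full_chocophlan.v201901_v31.tar.gz

Failing that, rsync with --partial from the MacBook at least avoids restarting the upload from scratch after a dropped connection.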