I have been having a great deal of trouble downloading the full_chocophlan.v201901_v31.tar.gz database for humann3. Everything else has gone well, but every time I use the command:
humann_databases --download chocophlan full path/to/directory
I eventually get an error. The download does actually progress, but at some point the transfer seems to give up and I get the following error:
I have tried downloading it directly from the http://huttenhower.sph.harvard.edu website, but that never works for me either: the download slows to a crawl and never finishes.
Any help in how to obtain the database would be much appreciated.
Hi Imadh, sorry to hear you are having issues downloading the database from our server. Would you try one more time? Occasionally our server is under load, which can make downloads take longer. Hopefully it will work on the next attempt; if not, please post again.
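If the transfer keeps stalling partway through, a resumable download can help instead of restarting from zero each time. A minimal sketch with wget; note the URL below is a placeholder, not the real download location (check the HUMAnN manual for the actual link), and the target directory is likewise an assumption:

```shell
# Resume a partial download (-c), with a read timeout and several retries.
wget -c -T 60 -t 10 \
    http://example.org/path/to/full_chocophlan.v201901_v31.tar.gz

# Unpack into the database directory once the download completes.
tar -xzf full_chocophlan.v201901_v31.tar.gz -C /path/to/chocophlan
```

With `-c`, an interrupted transfer picks up where it left off, so repeated stalls still make forward progress.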
Hi @Rakesh_Chouhan, sorry to hear you had issues with the download, and glad to hear you finally got it installed. We have been migrating our downloads to a new provider, which should now be online, stable, and faster!
humann v3.7 and MetaPhlAn version 4.0.0 (22 Aug 2022)
Hi, I downloaded the chocophlan and uniref databases on October 3rd; both succeeded in about 40 minutes. Since then my databases were deleted by cluster maintenance, and I have been unable to download them again for the past two days. The command creates the folders for the databases but produces no output or error message and downloads no files (both commands have now been running for 24 hours).
code:
humann_databases --download chocophlan full $INSTALL_LOCATION
humann_databases --download uniref uniref90_diamond $INSTALL_LOCATION
Is there a problem with the new provider? I appreciate your help, thanks!!
We recently re-hosted all of our downloads, and there was a period during which they were not available. If this issue persists, please let us know.
With
humann_databases --download chocophlan full $INSTALL_LOCATION
I get no error message and no output.
If I try with wget instead, I get this error message:
The admin of the Duke Computing Cluster suggests there could be a security block preventing access to your new server. I would appreciate any updates on this topic; I am still trying to download the chocophlan and uniref databases. For chocophlan, I am running:
conda activate humann
humann_databases --download chocophlan full /hpc/group/humanndatabase
Maybe there is something else we have to specify now? Thank you!
Hi @Emilia, I just wanted to check in to see whether you were able to download the databases. If not, it would be great to know what security blocks your admin is seeing. We should not have any on our end, but if you are running into any issues (or seeing any blocks), please let us know so we can get them fixed!
If you prefer using the command line, try running the humann_databases command again. Sometimes, transient network issues can cause intermittent failures.
humann_databases --download chocophlan full path/to/directory
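As a stopgap for transient failures, the download command can be wrapped in a simple retry loop. This is only a sketch, assuming `INSTALL_LOCATION` is set as in the posts above; the retry count and delay are arbitrary:

```shell
# Re-run the HUMAnN database download a few times before giving up.
for attempt in 1 2 3; do
    humann_databases --download chocophlan full "$INSTALL_LOCATION" && break
    echo "Attempt $attempt failed; retrying in 60 s..." >&2
    sleep 60
done
```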
Thank you @gjordaopiedade! I manually downloaded the databases and updated the config file.
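For anyone else in the same situation, pointing HUMAnN at manually downloaded databases looks roughly like this. The directory paths are placeholders; `humann_config --update` and the `database_folders` section are part of the standard HUMAnN CLI:

```shell
# After unpacking the archives manually, tell HUMAnN where they live.
humann_config --update database_folders nucleotide /path/to/chocophlan
humann_config --update database_folders protein /path/to/uniref
humann_config --update database_folders utility_mapping /path/to/utility_mapping

# Verify the new defaults.
humann_config --print
```

Alternatively, the same locations can be passed per run with `--nucleotide-database`, `--protein-database`, and `--utility-database`.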
Now I am facing the next problem, i.e.: "ERROR: You are using the demo utility database with a non-demo input file. If you have not already done so, please run humann_databases to download the full utility database. If you have downloaded the full database, use the option --utility-database to provide the location. You can also run humann_config to update the default database location. For additional information, please see the HUMAnN User Manual."
I have all the databases (the full ones, not the demo ones) downloaded to a directory, but I am getting the error with the following command:
UPDATE
```
humann3 --input SAMN08516737_1.sam --output …/gene_seqs --nucleotide-database …/…/…/…/…/reference/reference_database/humann3/chocophlan --protein-database …/…/…/…/…/reference/reference_database/humann3/uniref --threads 16 --utility-database …/…/…/…/…/reference/reference_database/humann3
Output files will be written to: /xdisk/syedzaidi/Metagenome_bstPrac/MicrobiomeBestPracticeReview/Metagenomics_analysis/Test_Metagenomic_1/analysis/Gene_based_analysis_onContigs/functional_classification/gene_catalogue/gene_seqs
Process the sam mapping results …
Computing gene families …
Traceback (most recent call last):
  File "/home/u13/syedzaidi/anaconda3/envs/MG_analysis2/bin/humann3", line 33, in
    sys.exit(load_entry_point('humann==4.0', 'console_scripts', 'humann3')())
  File "/home/u13/syedzaidi/anaconda3/envs/MG_analysis2/lib/python3.7/site-packages/humann/humann.py", line 1102, in main
    families_file=families.gene_families(alignments,gene_scores,unaligned_reads_count)
  File "/home/u13/syedzaidi/anaconda3/envs/MG_analysis2/lib/python3.7/site-packages/humann/quantify/families.py", line 45, in gene_families
    total_all_scores_normalization=alignments.convert_alignments_to_gene_scores(gene_scores,config.count_normalization)
  File "/home/u13/syedzaidi/anaconda3/envs/MG_analysis2/lib/python3.7/site-packages/humann/store.py", line 500, in convert_alignments_to_gene_scores
total_all_scores_normalization=1/total_all_scores*1e6
ZeroDivisionError: division by zero
```
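The traceback ends in a division by zero inside the counts-per-million normalization, which happens when no reads aligned at all, so the sum of gene scores is zero (e.g. an empty or incompatible SAM input, or a database mismatch). A simplified stand-in for the failing arithmetic, with a guard; this is a sketch, not HUMAnN's actual code:

```python
def cpm_normalization_factor(total_all_scores: float) -> float:
    """Return the counts-per-million scaling factor, guarding the zero case.

    HUMAnN's store.py effectively computes 1 / total * 1e6, which raises
    ZeroDivisionError when no alignments survive filtering.
    """
    if total_all_scores == 0:
        raise ValueError(
            "Total alignment score is zero: no reads were mapped. "
            "Check the input SAM file and the database paths."
        )
    return 1.0 / total_all_scores * 1e6

print(cpm_normalization_factor(2_000_000.0))  # prints 0.5
```

In other words, the crash itself is a symptom: the real problem is upstream, in an input or database configuration that leaves HUMAnN with nothing to quantify.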