
I want to download the file viewable at this address to a remote Linux machine:

https://drive.google.com/file/d/0Bz7KyqmuGsilT0J5dmRCM0ROVHc/view?usp=sharing

I'm hoping I can do this with wget.

I tried

wget https://drive.google.com/file/d/0Bz7KyqmuGsilT0J5dmRCM0ROVHc/vgg16_weights.h5

and the response was a 404.

Is it possible to wget a Google Drive file? If so, what is the path to provide? If not, are there any alternatives (bash or other) so I can avoid downloading the file to my local machine and transferring it to the remote?

Kara
efbbrown
  • Read this early answer http://stackoverflow.com/a/25033499/2666859 – Serenity May 26 '16 at 07:12
  • For the information of others, you can see my answer here using CURL (Updated March 2018): https://stackoverflow.com/a/49444877/4043524 – Amit Chahar Mar 27 '18 at 13:25
  • Possible duplicate of [wget/curl large file from google drive](https://stackoverflow.com/questions/25010369/wget-curl-large-file-from-google-drive) – craq Sep 23 '19 at 22:45

4 Answers


Insert your file ID into this URL (https://drive.google.com/uc?export=download&id=), then surround the URL with quotes so that Bash doesn't misinterpret the &, like so:

wget "https://drive.google.com/uc?export=download&id=0Bz7KyqmuGsilT0J5dmRCM0ROVHc"

Reference here.
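The ID-extraction step can be scripted so you can paste the share link directly. This is a sketch, not part of the original answer: `gdrive_url` is a hypothetical helper name, and the `sed` pattern assumes the standard `/file/d/<ID>/` share-link layout.

```shell
# Turn a Drive "view" share link into a direct-download URL.
# gdrive_url is a hypothetical helper; it assumes the link contains
# the usual /file/d/<ID>/ segment.
gdrive_url() {
  local id
  id=$(printf '%s\n' "$1" | sed -n 's|.*/file/d/\([^/]*\).*|\1|p')
  printf 'https://drive.google.com/uc?export=download&id=%s\n' "$id"
}

gdrive_url "https://drive.google.com/file/d/0Bz7KyqmuGsilT0J5dmRCM0ROVHc/view?usp=sharing"
# prints https://drive.google.com/uc?export=download&id=0Bz7KyqmuGsilT0J5dmRCM0ROVHc
```

You can then pass the printed URL to wget, remembering to quote it so Bash doesn't misinterpret the `&`.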


When downloading big files, Google Drive adds a security warning that breaks the script above. In that case, you can download the file using:

wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate 'https://docs.google.com/uc?export=download&id=FILEID' -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=FILEID" -O FILENAME && rm -rf /tmp/cookies.txt

(Script taken from here)
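To make the one-liner easier to follow, here is just the confirm-token extraction isolated into a function. The sample input line is invented for illustration (the real warning-page markup may differ), but the `sed` pattern is the same one used above.

```shell
# Pull the confirm token out of Drive's virus-scan warning page,
# using the same sed pattern as the one-liner above.
extract_confirm() {
  sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1/p'
}

# Invented sample of the warning-page HTML (an assumption, not real output):
sample='href="/uc?export=download&amp;confirm=ABcd&amp;id=FILEID"'
printf '%s\n' "$sample" | extract_confirm
# prints ABcd
```

The full command then substitutes that token (plus the session cookies saved in `/tmp/cookies.txt`) into a second download request.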

GPhilo
    Google Drive adds a security warning if the file is large. Then the given solution does not work anymore. Does someone have a solution for this? – pltrdy Jan 10 '17 at 14:25
  • 1
    For those using `curl`, the above also works, _e.g._: `curl -L -o file.out 'https://drive.google.com/uc?export=download&id=0Bz7KyqmuGsilT0J5dmRCM0ROVHc'`. Note that the `-L` command-line option is needed for following Google's redirects. – Castaglia Feb 27 '17 at 15:39
  • 4
    Not working even for small files anymore; gives me a 404. – iratzhash Apr 17 '17 at 12:33
  • @yazfield Try this script: `id=$(echo $1 | cut -d "=" -f2) url=$(echo "https://drive.google.com/uc?export=download&id="$id) wget "$url" -O $2` and run with the original drive link (not the ID, but the full URL) plus an output file name. – nightcod3r Jun 20 '17 at 20:23
  • doesn't work when file has this warning "Exceeds the maximum size that Google can scan. Would you still like to download this file?" For example: https://drive.google.com/a/broadinstitute.org/uc?id=0B60wROKy6OqceTNZRkZnaERWREk&export=download – user553965 Jul 12 '17 at 18:33
  • @user553965 I am also trying to download dbNSFP, but 3.5a. any luck? – jimh Aug 30 '17 at 21:31
  • worked for me https://gist.github.com/iamtekeste/3cdfd0366ebfd2c0d805#gistcomment-2359248 — that is this answer with more automated steps. Just be sure that your file is shared with everyone and not visible only to a restricted user group. – Luigi Pirelli Jun 13 '19 at 13:40
  • [here](https://medium.com/@acpanjan/download-google-drive-files-using-wget-3c2c025a8b99) is a solution that works for big files as well – GPhilo Mar 21 '22 at 09:30
  • Just use confirm=yes: wget "https://drive.google.com/u/3/uc?id=0Bz7KyqmuGsilT0J5dmRCM0ROVHc&export=download&confirm=yes" – mircobabini Apr 20 '22 at 20:26

The shortest way I have found for downloading big files:

git clone https://github.com/chentinghao/download_google_drive.git
cd download_google_drive/
python download_gdrive.py FILE_ID DESTINATION_PATH

Only view access, the FILE_ID, and the DESTINATION_PATH (including the file name) are required. It is still working as of November 12, 2019, and has been for more than a year.

Another solution is googleapiclient, which lets you automate downloading and uploading private files given their IDs.

Note: also make sure the file is shared publicly, not just with a group or within an organization.

Matias Haeussler

First, click the Share button in the top-right corner and set the permission so that anyone with the link can view.

Click File -> Download as -> PDF Document (.pdf) in the top-left corner to start the download in your browser.

Find the URL in your browser's download manager; in Chrome it is at chrome://downloads/.

At the time of writing this answer, the URL looks like https://docs.google.com/document/export?format=pdf&id=xxx&token=xxx&includes_info_params=true

I was able to download the PDF with wget: wget -O xxx.pdf "https://docs.google.com/document/export?format=pdf&id=xxx"
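As a sketch, the export URL can be built from the document ID alone. `docs_pdf_url` is a hypothetical helper name (an assumption, not part of the original answer), and the `xxx` placeholder stands for a real document ID:

```shell
# Build the Docs PDF-export URL for a given document ID.
# docs_pdf_url is a hypothetical helper; pass the id= value
# captured from the browser's download manager.
docs_pdf_url() {
  printf 'https://docs.google.com/document/export?format=pdf&id=%s\n' "$1"
}

# Example (requires network access and a real document ID):
# wget -O out.pdf "$(docs_pdf_url xxx)"
docs_pdf_url xxx
# prints https://docs.google.com/document/export?format=pdf&id=xxx
```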

hailinzeng

This works for me: https://www.matthuisman.nz/2019/01/download-google-drive-files-wget-curl.html Google shows a warning if the file is larger than 100 MB, so that case has to be handled separately. The author shows how to do that and provides a script to automate it: https://github.com/matthuisman/gdrivedl

al.zatv