I have 1,200 URLs that point to images. I'd like to save them all to my hard drive. How can I accomplish this with one bulk operation?
3 Answers
0
Check out HTTrack for a graphical interface option. If you don't mind using the command line, wget can accept a text file as input (with the -i switch) containing a list of URLs to download.
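For example, a minimal sketch of the wget approach, assuming your 1,200 URLs are saved one per line in a file called urls.txt and you want the files in a folder named images (both names are just placeholders, adjust them to your setup):

    # read the list of URLs from urls.txt and save each downloaded file into ./images
    wget -i urls.txt -P images/

The -i switch tells wget to read URLs from the given file, and -P sets the directory the downloads are saved into.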
-
I just tested the software and I really like what it does. Do you know how to keep the original filename? The filename is in the link, for example 1250757780.jpg, and it comes from a link like img/i.php?type=i&file=1250757780.jpg – Chris Dec 28 '09 at 17:40
-
In Preferences and mirror options, go to the Build tab and check the "Hide query strings" box. That should do the trick. You can also set your own custom save format on the Build tab after clicking Options. – Dec 28 '09 at 18:01
-
Indeed, the problem is that it saves names like i76f3.jpg when it should be something like 1250757780.jpg – Chris Dec 28 '09 at 18:07
-
I found some good software, PicaLoader, which has an option to save the image with the complete URL name – Chris Dec 28 '09 at 19:17
-
For the record, CocoaWget for Mac provides a graphical interface for some basic wget commands. The current version (2.7) lets you paste a list of URLs into the single URL field, and it will add each of them to the download queue. – spatical Jun 24 '14 at 07:19
0
In Unix or Cygwin, you can use wget to fetch the contents of a URL. Note that wget does not read URLs from standard input on its own, so pass the list with the -i switch rather than piping it in bare.
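A minimal sketch, assuming the list lives in list_of_urls.txt with one URL per line:

    # read the URLs directly from the file
    wget -i list_of_urls.txt

    # or feed them through a pipe; the lone - tells wget to read the list from stdin
    cat list_of_urls.txt | wget -i -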