
I'm currently attempting to migrate a large number of images to another site, so I must first download them. The file names of the images increase sequentially by 1: the files are literally called 1.jpg, 2.jpg, 3.jpg, and so on.

How would I go about doing this as efficiently as possible? There are thousands of images!

Additionally, I do not have access to the server, so I cannot simply select all the images via FTP. Each image is listed by itself on a separate page.

  • On Stack Overflow, the first step is to try writing some code, then ask for help once you encounter a problem with it. As is, your question really seems like you're asking us to write code for you. – the Tin Man Feb 12 '14 at 05:47
  • Also, look at the existing answers like this one: http://stackoverflow.com/questions/6768238/download-an-image-from-a-url – orde Feb 12 '14 at 05:49
  • 1
    Also, if you have to "migrate" the files, you should have authorized access to the files, which would open up all sorts of fast ways to move them. Scraping pages sounds like a backdoor attempt. – the Tin Man Feb 12 '14 at 05:49

1 Answer

mkdir imgs
cd imgs
# Brace expansion generates 1.jpg through 10000.jpg; --remote-name-all
# saves each file under its remote name (1.jpg, 2.jpg, ...).
curl --remote-name-all http://www.example.com/imgs/{1..10000}.jpg
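With thousands of files, some downloads may fail mid-run. A more resilient variant (a sketch, assuming the same placeholder host and path) writes the URL list to a file in curl's config format, so the transfer can use retries and be restarted without re-typing anything:

```shell
#!/bin/sh
# Build a list of sequential image URLs, then download them with curl.
# The host and path (www.example.com/imgs) are placeholders from the answer.
mkdir -p imgs
cd imgs || exit 1

# One "url = ..." line per image, in curl's --config format.
# Adjust the upper bound to the real image count.
seq 1 10000 | sed 's|^|url = "http://www.example.com/imgs/|; s|$|.jpg"|' > urls.txt

# --remote-name-all keeps the original file names; --retry 3 re-attempts
# transient failures; --fail skips saving HTTP error pages as .jpg files.
# Uncomment to run the actual transfer:
# curl --retry 3 --fail --remote-name-all --config urls.txt
```

Alternatively, curl has its own URL globbing, so `curl -O "http://www.example.com/imgs/[1-10000].jpg"` avoids passing thousands of shell-expanded arguments on the command line.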
– squid314