I have created a text file that contains about 20,000 URLs. I want to download and save the HTML of each URL. The catch is that I have to organize the output into folders: each folder must contain the HTML of 50 URLs. Can you help me?
Welcome to Stack Overflow! Visit the [help], take the [tour] to see what to [ask]. Please first [search for related topics on SO](https://www.google.com/search?q=scrape+urls+site%3Astackoverflow.com) and, if you get stuck, post your attempt using the [snippet editor](https://meta.stackoverflow.com/questions/358992/ive-been-told-to-create-a-runnable-example-with-stack-snippets-how-do-i-do). – mplungjan Nov 03 '21 at 14:03
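
For what it's worth, here is a minimal sketch of one way to do this in Python. It assumes the URLs sit one per line in a file called `urls.txt` and uses the third-party `requests` library; the folder names (`batch_000`, `batch_001`, ...) and file names are made up for illustration, not taken from the question.

```python
import os
import requests

URLS_FILE = "urls.txt"   # hypothetical input file, one URL per line
BATCH_SIZE = 50          # 50 HTML files per folder

with open(URLS_FILE) as f:
    urls = [line.strip() for line in f if line.strip()]

for i, url in enumerate(urls):
    # A new folder every 50 URLs: batch_000, batch_001, ...
    folder = f"batch_{i // BATCH_SIZE:03d}"
    os.makedirs(folder, exist_ok=True)

    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
    except requests.RequestException as exc:
        print(f"Skipping {url}: {exc}")
        continue

    # Use the position in the list as the file name so URLs that
    # differ only in path or query string don't collide.
    with open(os.path.join(folder, f"{i:05d}.html"), "w", encoding="utf-8") as out:
        out.write(resp.text)
```

With 20,000 URLs a sequential loop like this will be slow; if that matters, the same per-URL logic could be run through `concurrent.futures.ThreadPoolExecutor` to fetch several pages at a time.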