
I have about 10 URLs like:

links = [
    "https://example.org/example/example.txt",
    "https://example.org/example/example1.txt",
    "https://example.org/example/example2.txt",
    "https://example.org/example/example3.txt",
    "https://example.org/example/example4.txt",
    "https://example.org/example/example5.txt",
    "https://example.org/example/example6.txt",
    "https://example.org/example/example7.txt",
    "https://example.org/example/example8.txt",
    "https://example.org/example/example9.txt",
]

How can I get all the data from these 10 URLs at once and write it to one file? I used urllib, but the data from each URL overwrites the data from the previous one.

import requests

for url in links:
    try:
        r = requests.get(url)
        data = r.content
        # The file is reopened in 'wb' mode on every iteration,
        # so each response overwrites the previous one.
        with open('abc.txt', 'wb') as fp:
            fp.write(data)
    except ValueError:
        pass
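One way to address both parts of the question (fetching the URLs "at once" and writing everything to a single file) is to download concurrently and open the output file only once, outside the loop. A minimal sketch using the standard library's `urllib.request` and `concurrent.futures`, since the question mentions urllib; the function names `fetch` and `save_all` are just illustrative:

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen
from urllib.error import URLError

def fetch(url):
    """Return the body of one URL, or b'' if the download fails."""
    try:
        with urlopen(url) as resp:
            return resp.read()
    except (URLError, ValueError):
        return b""

def save_all(urls, path):
    """Download all URLs concurrently, then write them to one file in order."""
    with ThreadPoolExecutor(max_workers=10) as pool:
        bodies = pool.map(fetch, urls)  # map() preserves input order
        # Open the output file ONCE, so each body is appended after
        # the previous one instead of overwriting the whole file.
        with open(path, "wb") as fp:
            for body in bodies:
                fp.write(body)
```

The key change from the code above is that `open(path, "wb")` happens once, before iterating, so every write lands after the previous one; alternatively, opening with mode `'ab'` inside the loop would also append rather than truncate.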
  • Does this answer your question? [How do you append to a file?](https://stackoverflow.com/questions/4706499/how-do-you-append-to-a-file) – Tzane Apr 28 '22 at 07:47

0 Answers