
I have a list of 500+ URLs that I want to interact with in parallel using Selenium.

I came across this post on how to do it using joblib: Python parallel execution with selenium

However, after about 4 instances my computer became quite slow because the drivers were consuming a lot of RAM. The interactions happen at periodic intervals for each URL inside a while True loop.
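For context, here is a simplified sketch of my current setup (the URL list, interval, and page interaction are placeholders, not my real code):

```python
# Simplified sketch of my current joblib + Selenium setup
# (placeholder URLs, interval, and interaction).
import time
from joblib import Parallel, delayed
from selenium import webdriver

urls = ["https://example.com/page%d" % i for i in range(500)]  # placeholder list

def monitor(url):
    driver = webdriver.Chrome()   # one full browser per worker -> heavy on RAM
    driver.get(url)
    while True:                   # periodic interactions, runs forever
        # ... interact with the page here ...
        driver.refresh()
        time.sleep(60)

# joblib spins up a worker process per job; with only a handful of drivers
# running, the machine already slows down.
Parallel(n_jobs=4)(delayed(monitor)(u) for u in urls)
```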

Is there a better way to pull this off in a memory-efficient manner?

Would it be possible to open all 500+ URLs within a single browser instance and then act on them in parallel?
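To illustrate what I mean, something along these lines (purely illustrative, I'm not sure it is viable since a single driver can only focus one tab at a time, so this would really be round-robin polling rather than true parallelism):

```python
# Rough idea of "one browser, many tabs" (Selenium 4 API).
import time
from selenium import webdriver

urls = ["https://example.com/page%d" % i for i in range(10)]  # placeholder URLs

driver = webdriver.Chrome()
driver.get(urls[0])
for url in urls[1:]:
    driver.switch_to.new_window("tab")   # open each remaining URL in its own tab
    driver.get(url)

while True:
    for handle in driver.window_handles:
        driver.switch_to.window(handle)  # focus that tab
        # ... interact with the page here ...
    time.sleep(60)                       # periodic interval
```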

I am also getting the following error after running joblib a few times:

Exception has occurred: TerminatedWorkerError (note: full exception trace is shown but execution is paused at: <module>)
A worker process managed by the executor was unexpectedly terminated. This could be caused by a segmentation fault while calling the function or by an excessive memory usage causing the Operating System to kill the worker.
I believe memory usage is a core issue with Selenium; perhaps https://github.com/seleniumhq/selenium-google-code-issue-archive/issues/4988 or https://stackoverflow.com/questions/41918828/selenium-using-too-much-memory will help you. – Kwsswart Oct 27 '21 at 12:39
