Here is my problem: I want to scrape multiple sites, then enrich the results by querying an API before saving them in a database.
My first approach was to chain an API request after each scraped item (roughly like the sketch below) and launch the spiders in parallel. The problem was the API rate limit: with every spider firing its own API requests, I could not control when the API was called or throttle the overall call rate.
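To make "chaining" concrete, here is a rough sketch of what I mean; the site URL, API URL and field names are placeholders, not my real code:

```python
import scrapy


class SiteSpider(scrapy.Spider):
    name = "site"
    start_urls = ["https://example.com/products"]  # placeholder

    def parse(self, response):
        for row in response.css("div.product"):
            item = {"name": row.css("h2::text").get()}
            # Chain an API request for every scraped item.
            yield scrapy.Request(
                f"https://api.example.com/enrich?q={item['name']}",  # placeholder API
                callback=self.parse_api,
                cb_kwargs={"item": item},
            )

    def parse_api(self, response, item):
        # Merge the API answer into the item before it goes to the pipelines.
        item["enriched"] = response.json()
        yield item
```

With several spiders like this running in parallel, each one hits the API on its own schedule, which is where I lose control of the rate.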
My second try was to chain all the spiders so they run one after another (see the sketch below), but I can't get the results out of one spider and into the next when chaining, and this method also prevents running the scrapes in parallel.
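What I tried for chaining spiders looks roughly like the sequential-run pattern from the Scrapy docs; the spider class names are placeholders:

```python
from twisted.internet import defer, reactor
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging

from myproject.spiders import SiteOneSpider, SiteTwoSpider, ApiSpider  # placeholders

configure_logging()
runner = CrawlerRunner()


@defer.inlineCallbacks
def crawl():
    yield runner.crawl(SiteOneSpider)  # runs to completion first
    yield runner.crawl(SiteTwoSpider)  # then this one starts
    yield runner.crawl(ApiSpider)      # supposed to consume the previous results
    reactor.stop()


crawl()
reactor.run()
```

The trouble is that `runner.crawl()` doesn't hand the scraped items back to me, so I have no clean way to feed them into the next spider, and the spiders obviously no longer run in parallel.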
The ideal setup would be to run the multiple scrapes in parallel and feed all their items to the API from a single place (a dedicated spider or similar component), so I can control the call rate there (see the last sketch below). Does anyone have a solution for this, even if it means changing frameworks?
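Something like the following is the kind of rate-limited enrichment step I have in mind, as a shared item pipeline instead of a spider; the API URL, field names and the one-call-per-second interval are placeholders, and blocking the crawl with `time.sleep` feels wrong, which is why I'm asking for a better approach:

```python
import time

import requests


class RateLimitedEnrichPipeline:
    MIN_INTERVAL = 1.0  # seconds between API calls (placeholder rate)
    _last_call = 0.0    # class attribute, shared by all spiders in the process

    def process_item(self, item, spider):
        # Enforce a minimum delay between consecutive API calls.
        wait = self.MIN_INTERVAL - (time.time() - RateLimitedEnrichPipeline._last_call)
        if wait > 0:
            time.sleep(wait)  # crude: this blocks the whole crawl while waiting
        RateLimitedEnrichPipeline._last_call = time.time()

        resp = requests.get(
            "https://api.example.com/enrich",  # placeholder API
            params={"q": item["name"]},
        )
        item["enriched"] = resp.json()
        return item
```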
Thanks for your answers.