
I have a Django-based web app that is essentially a biography site with more than 200k pages. Each page is unique, with proper unique meta tags. Google has indexed around 270 pages so far over these 20 days. The biggest issue seems to be how infrequently the site is getting crawled: the last time I checked, it had fewer than 60 daily crawl requests. How are people with large sites getting thousands of pages indexed within weeks? I would even be fine with 100 pages indexed per day, but that's not happening at all. What do you suggest?
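For context on how large sites typically expose that many URLs to Google: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so a 200k-page site usually serves several sitemap files referenced by a sitemap index. Below is a hedged, standalone Python sketch of that chunking (the domain, URL pattern, and page count are hypothetical; in a real Django project you would more likely use `django.contrib.sitemaps`):

```python
# Sketch: split a large URL list into <=50k-URL sitemap files plus an index.
# All URLs and counts here are hypothetical placeholders.
from xml.etree import ElementTree as ET

MAX_URLS_PER_SITEMAP = 50_000  # limit from the sitemaps.org protocol

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return one <urlset> sitemap document as an XML string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

def build_sitemap_index(base, pages):
    """Chunk `pages` into 50k groups; return (index_xml, list_of_sitemap_xml)."""
    chunks = [pages[i:i + MAX_URLS_PER_SITEMAP]
              for i in range(0, len(pages), MAX_URLS_PER_SITEMAP)]
    index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    sitemaps = []
    for n, chunk in enumerate(chunks, start=1):
        loc = ET.SubElement(ET.SubElement(index, "sitemap"), "loc")
        loc.text = f"{base}/sitemap-{n}.xml"  # hypothetical naming scheme
        sitemaps.append(build_sitemap(chunk))
    return ET.tostring(index, encoding="unicode"), sitemaps

# Example: 120,001 hypothetical biography URLs -> 3 sitemap files
pages = [f"https://example.com/bio/{i}" for i in range(120_001)]
index_xml, files = build_sitemap_index("https://example.com", pages)
print(len(files))  # 3
```

Submitting the index URL once in Google Search Console lets Google discover every listed page, though (as the comments below note) discovery does not guarantee a faster crawl rate.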

Tookie
  • Please see the linked duplicate — some possible strategies are requesting an increased crawl rate, improving navigation, eliminating duplicate content, using internal links smartly, encouraging more inbound links, and analyzing competitor behavior. – Maximillian Laumeister Dec 06 '22 at 21:24
  • In weeks? Not possible. Your timeline is going to be years. – Stephen Ostermiller Dec 06 '22 at 22:28

0 Answers