I have a project that aggregates public information on companies, startups, founders, etc., covering more than 2 million tracked entities worldwide.
Yes, every single page is high quality, manually curated and vetted as much as possible to provide the best value to the customer. Trouble is, Google is only crawling my website at a rate of about 300 pages/day. That's roughly 109.5K pages per year. At this rate, Google will effectively never index my entire website (it would take around 18 years).
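(For anyone who wants to sanity-check a crawl-rate figure like that from their own server logs rather than Search Console's Crawl Stats report, here's a rough sketch that counts Googlebot requests per day in a standard combined-format access log. The log path is a placeholder, and a strict check would also verify that the requesting IPs really belong to Google.)

```python
import re
from collections import Counter
from datetime import datetime

# Assumption: an nginx/Apache "combined"-format access log; the path is a placeholder.
LOG_PATH = "/var/log/nginx/access.log"

# Pull the date out of the timestamp field, e.g. [10/Oct/2023:13:55:36 +0000]
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4}):")

hits_per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
    for line in f:
        # Crude user-agent filter; a strict check would also reverse-resolve
        # the requesting IP to googlebot.com / google.com.
        if "Googlebot" not in line:
            continue
        m = DATE_RE.search(line)
        if m:
            day = datetime.strptime(m.group(1), "%d/%b/%Y").date()
            hits_per_day[day] += 1

for day in sorted(hits_per_day):
    print(f"{day}: {hits_per_day[day]} Googlebot requests")
```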
To reiterate, I have over 2 million pages I'd like to get indexed. Here's what I've done so far:
- Squeezed out every bit of performance, so that Googlebot's crawl budget goes further on my site
- Ensured high-quality SEO to signal: yes Google, this is a good website, please come and crawl (see the sitemap sketch right after this list)
- Ensured high user value: it's a site that provides a genuinely valuable service, and I'm seeing high CTRs and low bounce rates even at positions 15+.
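(To make the scale problem concrete: the Sitemaps protocol caps a single sitemap file at 50,000 URLs, so 2M+ pages need dozens of sitemap files plus a sitemap index. Below is a minimal sketch of generating those, assuming the URLs live in a plain text file; the file names, output directory, and domain are all placeholders.)

```python
from pathlib import Path
from xml.sax.saxutils import escape

# Placeholders: one URL per line in urls.txt, sitemaps served from example.com/sitemaps/
URL_LIST = Path("urls.txt")
OUT_DIR = Path("sitemaps")
BASE = "https://example.com/sitemaps"
CHUNK = 50_000  # protocol limit per sitemap file

OUT_DIR.mkdir(exist_ok=True)
urls = [u.strip() for u in URL_LIST.read_text().splitlines() if u.strip()]

sitemap_files = []
for i in range(0, len(urls), CHUNK):
    name = f"sitemap-{i // CHUNK + 1}.xml"
    body = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls[i:i + CHUNK])
    (OUT_DIR / name).write_text(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{body}\n</urlset>\n"
    )
    sitemap_files.append(name)

# Sitemap index pointing at every chunk; this is the file to submit in Search Console.
index_body = "\n".join(
    f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>" for name in sitemap_files
)
(OUT_DIR / "sitemap-index.xml").write_text(
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{index_body}\n</sitemapindex>\n"
)
```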
My website is two months old, and only about 15k pages have been indexed so far. I know this problem can be solved: if you Google any of my competitors with site:crunchbase.com, site:owler.com, or site:zoominfo.com, they have literally tens of millions of pages indexed. Forget competing; right now I'm just fighting for a seat at the table.
How can I increase my indexing rate? I need something far higher than 300 pages/day, as high as possible really. (For reference, even 100k pages/day would take 20 days to cover the whole site.)
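(Quick back-of-the-envelope on those numbers, assuming 2 million pages:)

```python
# How long a full crawl takes at different crawl rates.
TOTAL_PAGES = 2_000_000

for pages_per_day in (300, 10_000, 100_000):
    days = TOTAL_PAGES / pages_per_day
    print(f"{pages_per_day:>7,} pages/day -> {days:>8,.0f} days (~{days / 365:.1f} years)")
```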
(P.S. If anyone wants, I can link to the website here.)