
I have a website with a lot of pages and images. I created sitemaps (grouped into a single sitemap index file) and submitted them to Google, but the indexing process is disappointingly slow:

Web pages and images: submitting and indexing

The sitemaps were submitted on Feb 4th; the submission process started on Feb 8th, and indexing has proceeded as follows:

02/09/2014 - 187 images indexed
02/10/2014 - 956
02/11/2014 - 1180
02/12/2014 - 1196
02/13/2014 - 1198
02/14/2014 - 1192 (!!!)
02/15/2014 - 561 (!!!)
02/16/2014 - 1144
02/17/2014 - 1144

So the image indexing process looks erratic.

However, the crawling process looks much more promising:

[Screenshot: crawl statistics]

Do I just need to wait longer? Or should I take some action to help Google index my website correctly?
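
For reference, my sitemap index follows the standard sitemaps.org structure. A minimal sketch (the file names and dates below are illustrative placeholders, not my actual ones):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <!-- placeholder: one of the per-type sitemap files -->
        <loc>http://example.com/sitemap-pages-001.xml</loc>
        <lastmod>2014-02-04</lastmod>
      </sitemap>
      <sitemap>
        <!-- placeholder: a sitemap listing image-bearing pages -->
        <loc>http://example.com/sitemap-images-001.xml</loc>
        <lastmod>2014-02-04</lastmod>
      </sitemap>
    </sitemapindex>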

Roman Matveev
  • You can submit pages individually using "Fetch as Google" in Webmaster Tools. You can use this tool to fetch up to 500 URLs a week. Not sure if it works for image search. – Osvaldo Feb 17 '14 at 17:51
  • Thanks, @Osvaldo, this could be helpful for a small to average blog, but definitely not for a project like mine, which has up to 1,500 new pages every day :( – Roman Matveev Feb 17 '14 at 17:53
  • Strange thing: I just checked, and many successful websites have NO /robots.txt file at all, which means there is no sitemap file either. The picture of the world significantly changed in my mind... – Roman Matveev Feb 17 '14 at 18:07
  • @RomanMatveev You don't need a robots.txt in order to have a sitemap, though you can indicate the location of your sitemap in one. I'd suggest following the Image publishing guidelines and submitting an image sitemap (a minimal sketch appears after these comments), then waiting at least five days as covered there. As also indicated there: "Google doesn't guarantee that we'll index all of your images or use all of the information in your Sitemap." – dan Feb 17 '14 at 18:11
  • Am I understanding correctly that you put hundreds of thousands of items into one sitemap file? The limit is 50,000 URLs per sitemap. You would need to create a sitemap index that lists sitemap files of 50,000 entries or fewer each; I use a limit of 45,000 to be safe. See http://www.sitemaps.org/protocol.html. You do not have to use compressed files, as shown in the example there. – closetnoc Feb 17 '14 at 21:26
  • @closetnoc He already mentioned he used sitemaps within a sitemap index file. – Max Feb 18 '14 at 02:43
  • Okay. The way it was worded made it sound like a single file to me. I even re-read it to make sure. Funny how we can sometimes read things differently. My apologies. – closetnoc Feb 18 '14 at 03:27
  • Forget about the sitemap file. If your site really has 1.5k useful pages per day, Google will come to you every hour. This is what happens with Stack Exchange and Stack Overflow. But if you want your images indexed faster, perhaps you will need 1.5k new images every day too. – Peter Feb 18 '14 at 03:52
  • @Peter, that's right. And the images come in even bigger amounts. So your suggestion is to just wait a bit more? – Roman Matveev Feb 18 '14 at 07:19
  • I would suggest that every image have a good file name as well as alt text. Since you will be adding a huge number of images daily, Google will definitely crawl much more frequently in the future. – Peter Feb 18 '14 at 09:35
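
A minimal image sitemap sketch along the lines of dan's comment, using Google's image sitemap extension; the page and image URLs are hypothetical, and the descriptive file name and title follow Peter's advice:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <!-- hypothetical page that embeds the image -->
        <loc>http://example.com/gallery/blue-widget.html</loc>
        <image:image>
          <!-- a descriptive file name helps image search -->
          <image:loc>http://example.com/images/blue-widget-front-view.jpg</image:loc>
          <image:title>Blue widget, front view</image:title>
        </image:image>
      </url>
    </urlset>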

1 Answer


There is no way to "force" Google to index anything. The best you can do is make it easy for Google to index your content and build up your reputation.

Providing a sitemap will tell Google about the content, but to get it indexed you need to get link juice to it as well. See The Sitemap Paradox for further reading. You need inbound links to your site, and your site then needs to link to all of its content internally.
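
As an aside on the robots.txt discussion in the comments: a robots.txt file is optional, but if you do serve one, it can point crawlers at your sitemap index with a single line (the URL here is a placeholder):

    Sitemap: http://example.com/sitemap-index.xml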

There is just no way that Google will index hundreds of thousands of items from a brand-new site with no reputation. You will only get that much content indexed over time, and with many other sites linking to and recommending your site.

Stephen Ostermiller