
I need around a week to transition a heavily data-driven website from one back end to another. During that time I plan to keep some pages live, but they won't all work well or look brilliant. Some pages won't work at all.

What is the best way to ensure I don't scare Google? Should I hide everything from robots.txt, or mark everything that doesn't work as "503", or are there other things that I should be considering?

1 Answer


Serving a 503 for everything is the best strategy I can think of, together with a Retry-After HTTP header.

Source: http://googlewebmastercentral.blogspot.co.il/2011/01/how-to-deal-with-planned-site-downtime.html
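To make the idea concrete, here's a minimal sketch (not from the answer itself, and all names are hypothetical) of a stand-in maintenance server using only the Python standard library. It answers every request with a 503 status, a Retry-After header hinting when crawlers should come back, and a small custom page rather than the default error message:

```python
# Hypothetical maintenance-mode server: every request gets a 503
# (Service Unavailable) plus a Retry-After header, which tells
# crawlers the outage is temporary and when to try again.
from http.server import BaseHTTPRequestHandler, HTTPServer

# The question mentions roughly a week of downtime; adjust as needed.
RETRY_AFTER_SECONDS = 7 * 24 * 60 * 60

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A custom body avoids showing visitors the ugly default error page.
        body = b"<h1>Down for maintenance</h1><p>We'll be back soon.</p>"
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    # HEAD requests from crawlers should get the same status and headers.
    def do_HEAD(self):
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.end_headers()

# To run it for real:
#     HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

In practice you would more likely configure this in your web server (e.g. an Apache or nginx rule) than run a separate process, but the response shape is the same: status 503 plus Retry-After.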

Roie Speiser
  • Great answer, thank you: great to see that Google have already thought of this one. I hadn't considered the "retry-after". – jamescridland Jun 01 '14 at 18:23
  • Thanks. I would also consider a custom 503 error page. You don't want to scare off your visitors with the ugly default message. Analytics code on this page is also recommended. – Roie Speiser Jun 01 '14 at 19:08