
I have an online shop, and the platform created a sitemap.xml automatically. But Google Search Console reports an error, and my robots.txt looks like this:

User-agent: *
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Sitemap: sumki5.ru/sitemap-shop.xml
Crawl-delay: 5

User-agent: Yandex
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Crawl-delay: 5
Sitemap: sumki5.ru/sitemap-shop.xml
Host: sumki5.ru

Is this normal? Can it be a problem for SEO?


1 Answer


The sitemap URL has to be absolute. In your case it is missing the https:// scheme. Change the line to:

Sitemap: https://sumki5.ru/sitemap-shop.xml

Google will then stop complaining in Search Console.
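If you want to verify the fix without waiting for Search Console to recrawl, you can check the live robots.txt yourself. Here is a minimal sketch in Python (assuming Python 3.8+, where RobotFileParser.site_maps() was added) that flags any Sitemap entry that is not an absolute URL:

import urllib.robotparser
from urllib.parse import urlparse

# Fetch and parse the live robots.txt
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://sumki5.ru/robots.txt")
parser.read()

# site_maps() returns the values of all Sitemap: lines, or None if there are none
for sitemap in parser.site_maps() or []:
    scheme = urlparse(sitemap).scheme
    status = "OK" if scheme in ("http", "https") else "NOT ABSOLUTE"
    print(status, sitemap)

With the robots.txt shown in the question, both Sitemap: lines would be reported as NOT ABSOLUTE; after adding https:// they print OK.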

If you don't fix the problem, Google will not be able to access your sitemap. That isn't necessarily a disaster. Sitemap files are not needed for good SEO; in fact, they really don't help rankings at all. What they do is give you additional insight into your site in Google Search Console. See: The Sitemap Paradox

Stephen Ostermiller