I have an online shop, and the platform generated a sitemap.xml for it automatically. However, Google Search Console reports an error, and my robots.txt looks like this:
User-agent: *
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Sitemap: sumki5.ru/sitemap-shop.xml
Crawl-delay: 5
User-agent: Yandex
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Crawl-delay: 5
Sitemap: sumki5.ru/sitemap-shop.xml
Host: sumki5.ru
Is this normal, or can it be a problem for SEO?
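For what it's worth, I have read that the Sitemap directive is supposed to contain a full absolute URL (including the scheme), and that Yandex no longer requires a separate Host line, so maybe the file should look something like this instead (assuming the site is served over HTTPS — I have not verified this fixes the error):

```
User-agent: *
Disallow: /my/
Disallow: /cart/
Disallow: /checkout/
Crawl-delay: 5

Sitemap: https://sumki5.ru/sitemap-shop.xml
```

Since the Yandex group was identical to the general one, I also dropped it here; I am not sure whether that part matters.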