Suppose I have a robots.txt that declares more than one sitemap. That is, if the robots.txt at mydomain.com ends with the following lines, Google will read both. If we add to this the fact that these sitemaps can be indexes that control where to read each file, we already have a mechanism similar to that of Google Search Console.
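
A minimal sketch of what those final lines could look like (the sitemap file names here are hypothetical):

User-agent: *
Disallow:

Sitemap: https://mydomain.com/sitemap-products.xml
Sitemap: https://mydomain.com/sitemap-blog.xml

And if one of those entries points to a sitemap index rather than a plain sitemap, it is that index file, in the sitemaps.org format, that controls which individual files get read:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://mydomain.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://mydomain.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>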

Of course, all of this has its risks, but if done well it is a new door that should not be wasted just because we are afraid to use it.

12. You can use sitemaps to encourage spiders to visit URLs that you want crawled (but not indexed).

Finally, and with a very practical use if we link it to the previous point (uploading sitemaps on our own), we have an old technique based on uploading sitemap.xml files to force crawling. It is known that when sitemaps are updated, the URLs contained in them end up being visited sooner or later, but this usually happens even faster (although it will not be immediate) if what we do is upload new sitemaps.
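
A minimal sketch of such a fresh sitemap, with hypothetical URLs; recent <lastmod> dates are the signal that there is something new worth crawling:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mydomain.com/pending-page-1</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
  <url>
    <loc>https://mydomain.com/pending-page-2</loc>
    <lastmod>2024-05-01</lastmod>
  </url>
</urlset>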

So, by uploading new sitemaps, we have a tool that allows us to create lists of pending URLs for Google to crawl. This can be very useful, especially in cases where we no longer link to these pages, or where crawling is slow and we do not trust the work of the spiders.

I will list some of these cases:

In a migration, leaving the old sitemap in place, or better yet, uploading a new sitemap with the URLs that now return 301 redirects, helps them be read sooner and ensures that all of them end up being crawled, as in the sketch below.
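
A minimal sketch of such a migration sitemap, with hypothetical old URLs that now answer with a 301 to their new addresses; listing them invites the spiders to revisit them and pick up the redirects:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://olddomain.com/old-product-1</loc>
  </url>
  <url>
    <loc>https://olddomain.com/old-product-2</loc>
  </url>
</urlset>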

Indicate the sitemap.xml in the robots.txt, pointing directly to another domain:

This is another resource documented even in the official documentation at sitemaps.org. There they tell us that we can include a sitemap in the "Sitemap:" directive of the robots.txt file regardless of whether it is located on the indicated domain or not, as in the sketch below.
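
A minimal sketch, assuming mydomain.com wants to host its sitemap on another (hypothetical) host:

# robots.txt served at https://mydomain.com/robots.txt
Sitemap: https://cdn.otherhost.com/sitemaps/mydomain-sitemap.xml

According to sitemaps.org, that sitemap can then list URLs for mydomain.com even though the file itself lives on the other host.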
