Sitemap & Robots.txt

Google will index your website even without a sitemap or robots.txt, but both help Google understand your site structure better and index your website faster.

A sitemap.xml is a file that helps search engines index all the pages on your site, understand their structure, and see when they were last updated.
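For illustration, a minimal sitemap.xml looks like this (the domain and dates below are placeholders, not your actual Vsble sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <lastmod> tells crawlers when it last changed -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/portfolio</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists one page of your site, so crawlers can find every page without following links.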

The robots.txt file controls how search engine robots crawl your site. It also includes a reference to the XML sitemap, which tells crawlers what URL structure the website has, improving crawling and indexing.
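A typical robots.txt that allows crawling and points to the sitemap might look like this (example domain used as a placeholder):

```
# Apply to all crawlers
User-agent: *
Allow: /

# Tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line is the reference mentioned above: it is how crawlers discover the sitemap from robots.txt.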

Vsble automatically generates your sitemap and robots.txt, so Google and other search engines can read and update your site structure (new pages, categories, content, and so on) each time you edit your website. This helps Google index your website even more accurately and faster than by default.

While you do not need to submit robots.txt to Google, you can submit your sitemap to Google for faster and more precise indexing. To do so:

First, add your website to Google Search Console, then:

1. In your Vsble dashboard, under SEO settings, turn on the switch to auto-generate sitemap.xml and robots.txt

2. Copy the sitemap link

3. Select "Sitemaps" in your Google Search Console

4. Paste the link to your sitemap and click "Submit" - that's it!

Once the sitemap is submitted and Google has scanned it, the status will change to "Success". After that, Google can see every page on your website and track updates.
