AnQiCMS supports customizing the robots.txt file to control the behavior of search engine crawlers.

The role of robots.txt

The robots.txt file tells search engine crawlers which pages may be crawled and which are off-limits, helping search engines index the site more effectively.
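A robots.txt file consists of one or more User-agent groups followed by Disallow (and optionally Allow) rules. As a minimal sketch, the following allows all crawlers to access everything except one directory; the /private/ path is only a placeholder:

User-agent: *
Disallow: /private/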

Default configuration

AnQiCMS provides a default robots.txt configuration that allows search engines to crawl public content. In most cases, no modification is needed.

Custom configuration

You can edit the robots.txt content in the admin panel and add Disallow rules to block crawling of specific directories or pages, such as the admin backend or search result pages, for example:
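The rules below are only an illustration; the actual admin and search paths depend on your installation, so confirm them before blocking anything:

User-agent: *
Disallow: /system/
Disallow: /search/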

Sitemap declaration

It is recommended to declare the Sitemap address in robots.txt to help search engines discover the sitemap file, for example: Sitemap: https://www.example.com/sitemap.xml
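Putting the pieces together, a complete robots.txt combining the example rules above with the Sitemap declaration could look like the sketch below; replace the domain and paths with your own:

User-agent: *
Disallow: /system/
Disallow: /search/
Sitemap: https://www.example.com/sitemap.xml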

Test verification

After configuration is complete, use the robots.txt test tool on Baidu Search Console or Google Search Console to verify that the configuration is correct.