SharePoint Robots.txt File

As you may know, robots.txt is a file that defines which content a search crawler is allowed to index and which it is not.
 
To change robots.txt in a SharePoint environment:
 
1. Go to Site Settings.
 
2. Under the Site Collection Administration section, activate the Search Engine Sitemap feature. This registers the site collection with the "Search Engine Sitemap job" timer job.
 
3. Now go to "Search engine sitemap settings", also under the Site Collection Administration section.
 
4. The Search Engine Sitemap settings page opens with the default settings:

User-agent: *
Disallow: /_layouts/
Disallow: /_vti_bin/
Disallow: /_catalogs/
You can add more Disallow entries to keep the crawler away from other paths. Fantastic, isn't it?
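Once the site is publicly reachable, you can sanity-check the rules from outside SharePoint. Below is a minimal sketch using Python's built-in urllib.robotparser; the site URL and the page path checked at the end are placeholders, so substitute your own site collection address and paths.

from urllib import robotparser

# Hypothetical site URL; replace with your own SharePoint site collection address.
SITE_URL = "https://contoso.sharepoint.example"

rp = robotparser.RobotFileParser()
rp.set_url(f"{SITE_URL}/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# The default SharePoint Disallow paths, plus one page that should remain crawlable.
for path in ("/_layouts/", "/_vti_bin/", "/_catalogs/", "/Pages/Home.aspx"):
    allowed = rp.can_fetch("*", f"{SITE_URL}{path}")
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")

With the default settings shown above, the first three paths should print "disallowed" and the regular page should print "allowed"; if you add more Disallow entries, rerun the script to confirm they take effect.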