Sitemap and Robots in Sitecore Multisite
Continuing from my previous blog, Steps to be Performed after Multisite Configuration, I am going to discuss two pieces of functionality that matter for Google crawling and ranking and that every website needs: Sitemap and Robots.txt implementation.
- Create a new template with two fields, SitemapContent and RobotsContent (these field names must match the ones read by the custom processor below).
- Inherit this template on the Home page template of both site nodes.
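As an illustration, a content editor could then fill the RobotsContent field on each site's home item with site-specific directives like the following (the domain and sitemap URL here are placeholders, not values from the configuration above):

```
User-agent: *
Disallow: /sitecore
Sitemap: https://website-a.com/sitemap.xml
```

The SitemapContent field would likewise hold each site's full sitemap XML, so both files can be managed per site directly in the content tree.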
- Create a patch file. This patch adds a preprocessor entry to allow the XML and TXT URL extensions, and a processor in the httpRequestBegin pipeline to serve the sitemap.xml and robots.txt files:
```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <pipelines>
      <preprocessRequest>
        <processor type="Sitecore.Pipelines.PreprocessRequest.FilterUrlExtensions, Sitecore.Kernel">
          <param desc="Allowed extensions (comma separated)">aspx, ashx, asmx, txt, xml</param>
        </processor>
      </preprocessRequest>
      <httpRequestBegin>
        <processor type="WebsiteNameSpace.ProjectName.RobotsSitemapTextProcessor, WebsiteNameSpace.ProjectName"
                   patch:before="processor[@type='Sitecore.Pipelines.HttpRequest.UserResolver, Sitecore.Kernel']"
                   resolve="true" />
      </httpRequestBegin>
    </pipelines>
  </sitecore>
</configuration>
```
- Create a class in your project for the custom processor and use the below code:

```csharp
using System;
using System.Web;
using Sitecore.Data.Items;
using Sitecore.Diagnostics;
using Sitecore.Pipelines.HttpRequest;

namespace WebsiteNameSpace.ProjectName
{
    public class RobotsSitemapTextProcessor : HttpRequestProcessor
    {
        public override void Process(HttpRequestArgs args)
        {
            try
            {
                HttpContext currentContext = HttpContext.Current;
                if (currentContext == null)
                    return;

                // Only handle requests ending in robots.txt or sitemap.xml.
                string requestedURL = currentContext.Request.Url.ToString().ToLower().Trim();
                if (string.IsNullOrWhiteSpace(requestedURL) ||
                    (!requestedURL.EndsWith("robots.txt") && !requestedURL.EndsWith("sitemap.xml")))
                    return;

                var isRobotsRequest = requestedURL.EndsWith("robots.txt");

                // Fallback content, served when the field is empty or missing.
                var defaultContent = isRobotsRequest
                    ? $"User-agent: *{Environment.NewLine}Disallow: /sitecore"
                    : $"<sitemap>{Environment.NewLine}</sitemap>";

                if (Sitecore.Context.Site != null && Sitecore.Context.Database != null)
                {
                    // Resolve the home item of the current site, which makes this multisite-aware.
                    Item homeNode = Sitecore.Context.Database.GetItem(Sitecore.Context.Site.StartPath);
                    if (homeNode != null)
                    {
                        var contentField = isRobotsRequest
                            ? homeNode.Fields["RobotsContent"]
                            : homeNode.Fields["SitemapContent"];
                        if (contentField != null && !string.IsNullOrWhiteSpace(contentField.Value))
                            defaultContent = contentField.Value;
                    }
                }

                currentContext.Response.ContentType = isRobotsRequest ? "text/plain" : "text/xml";
                currentContext.Response.Write(defaultContent);
                currentContext.Response.End();
            }
            catch (Exception ex)
            {
                Log.Error("SitemapRobots", ex, this);
            }
        }
    }
}
```
- Deploy your application and browse the pages:
Website-a.com/sitemap.xml
Website-b.com/sitemap.xml
Website-a.com/robots.txt
Website-b.com/robots.txt
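If the home-item fields are left empty, the processor falls back to its hard-coded defaults, so browsing robots.txt on either site should return approximately:

```
User-agent: *
Disallow: /sitecore
```

Once the editors populate the fields, each site serves its own content instead, which is the point of the multisite setup.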
If you want to implement the same functionality using an ASP.NET HTTP handler, you can refer to this blog post: http://sitecoreclimber.wordpress.com/2014/07/27/sitecore-multisite-robots-txt/
Happy Sitecoreing 😊