Last week, Google outlined some JavaScript issues that can negatively impact a site’s search results, and said it would soon release a tool to help webmasters better understand how it renders their sites. That tool has now been announced.
It comes as an addition to the Fetch as Google tool, which lets you see how Googlebot renders a page. To use it, submit a URL with the “Fetch and render” option in the Fetch as Google feature, found under Crawl in Webmaster Tools.
“In order to render the page, Googlebot will try to find all the external files involved, and fetch them as well,” writes Shimi Salant from Google’s Webmaster Tools team. “Those files frequently include images, CSS and JavaScript files, as well as other files that might be indirectly embedded through the CSS or JavaScript. These are then used to render a preview image that shows Googlebot’s view of the page.”
“Googlebot follows the robots.txt directives for all files that it fetches,” Salant explains. “If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.”
To get the most out of the new tool, Google recommends making sure Googlebot can access any embedded resource that contributes to your site’s visible content or layout in any meaningful way. Resources that don’t “meaningfully contribute” — such as social media buttons, some fonts, and analytics scripts — can be left disallowed from crawling.
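For illustration, a minimal robots.txt along those lines might look like this (the directory paths are hypothetical examples, not taken from Google’s announcement):

User-agent: Googlebot
# Resources that affect visible content or layout stay crawlable
Allow: /css/
Allow: /js/
Allow: /images/
# Analytics scripts don't meaningfully contribute to rendering,
# so they can remain disallowed
Disallow: /analytics/

The Allow directive is an extension to robots.txt that Googlebot honors; the Disallow line keeps the non-essential script blocked without hiding anything the rendered preview depends on.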
Image via Google