Google has added a useful new feature to its Fetch as Google tool, announced earlier this week. The new ‘Fetch and Render’ facility allows webmasters to discover whether any resources on their pages are being blocked. This matters because a page with blocked elements may look perfectly normal to the webmaster and to site visitors, yet search engine spiders may see something quite different, and that can interfere with rankings.
Fetch as Googlebot
The Fetch as Googlebot feature was added to Google Webmaster Tools in October 2009, so it has been around for quite a while now. It allows the webmaster to submit individual URLs, look at the HTTP response the server returns, and examine the HTML, which is great for diagnosing problems. The tool was improved in 2011, when Google made it possible for webmasters to use the fetch feature to submit pages to the index, an action that can result in fresh or updated pages appearing in the SERPs very quickly indeed.
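For readers who want a feel for what the fetch step reports, the short Python sketch below requests a page while identifying itself with a Googlebot-style user-agent string and prints the HTTP status, the response headers and the start of the returned HTML. It is only a rough local approximation of the tool's output, not the Webmaster Tools interface or any Google API; the URL is a placeholder, and real Googlebot behaviour such as rendering is not reproduced.

```python
# Rough local approximation of what a fetch reports: HTTP status,
# response headers and the raw HTML. The URL below is a placeholder.
from urllib.request import Request, urlopen

url = "https://www.example.com/"  # placeholder URL
req = Request(url, headers={
    "User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"
})

with urlopen(req) as response:
    print("Status:", response.status)           # e.g. 200
    for name, value in response.getheaders():   # server response headers
        print(f"{name}: {value}")
    html = response.read().decode("utf-8", errors="replace")
    print(html[:500])                            # first part of the returned HTML
```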
How to Fetch and Render as Googlebot
To use the new Fetch and Render facility, webmasters need to visit Google Webmaster Tools and log in in the normal fashion, select the ‘Crawl’ option from the list at the left of the page, and then click ‘Fetch as Google’ on the drop-down menu.
Users who are familiar with the Fetch as Google tool will notice only one change to the page that opens up: the addition of the red ‘Fetch and Render’ button and a drop-down menu that allows the selection of a device type, so webmasters can check how a page renders on different devices, including desktops and smartphones. URLs are entered in the usual way, the preferred device is selected from the drop-down menu, and then ‘Fetch and Render’ is clicked instead of ‘Fetch’.
Fetch and Render Benefits
Fetch and Render allows webmasters to take a look at the rendered page (as Google sees it) rather than just a page of code. In some cases the absence of important elements, such as text pulled in from blocked JavaScript files, may be apparent straight away. To take a look at the page, click on it when it appears underneath the search box.
The rendered page (as Google sees it) will be shown on the next screen, along with further information on why a page might be rendering only partially or not at all.
In some cases, listed items may be pulled into the page from third parties such as Google AdSense or social media sites. Problems of that nature are beyond the webmaster’s control and are nothing to worry about. If the problem is an internal one, however, usually caused by blocks imposed by the robots.txt file, webmasters can take a closer look at the affected areas and rectify them as necessary.
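As a quick way to confirm whether robots.txt is what is blocking a resource, the Python sketch below uses the standard library's robots.txt parser to test a few URLs against a site's live robots.txt the way a crawler identifying as Googlebot would read it. The domain and resource paths are hypothetical placeholders used purely for illustration.

```python
# Check which resources a site's robots.txt blocks for Googlebot.
# The domain and resource paths below are placeholders for illustration.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")  # placeholder site
parser.read()  # download and parse the live robots.txt

# Resources the rendered page depends on (hypothetical paths)
resources = [
    "https://www.example.com/css/main.css",
    "https://www.example.com/js/site.js",
    "https://www.example.com/images/logo.png",
]

for url in resources:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'allowed' if allowed else 'BLOCKED'}: {url}")
```

If a CSS or JavaScript file the page needs shows up as blocked, loosening or removing the corresponding Disallow rule in robots.txt should allow Googlebot to render the page fully.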