As announced on the Google Webmaster Central blog, Google has confirmed it is now crawling RSS feeds to discover fresh content and links.
While it has been known for some time within the search industry that Google actively crawls RSS feeds, Google had never officially confirmed this as part of its content discovery practices until now.
Just how much weight RSS feeds carry as an indexing factor remains to be seen, but now that we have confirmation, webmasters and SEOs should recommend that clients set up valid RSS feeds and make them available to Googlebot.
Here are some tips to make sure your RSS feed is set up for Googlebot crawling:
- Make sure your RSS feed is not being blocked in robots.txt (you can check this in Google Webmaster Tools).
- Link to your RSS feed via a standard HTML link on your website (don’t embed your RSS URL in JavaScript or Flash, or hide it behind a signup form).
- Make sure your RSS feed is valid. While not required by Googlebot, a valid RSS feed future-proofs access for other search engines and is not hard to set up ( http://validator.w3.org/feed/ ).
- Set up automatic RSS feed generation. This is especially important for blogs and news websites: any new content should be added to your RSS feed automatically as soon as it is published (many blog platforms have standard plugins to achieve this).
- Keep your RSS feed clean. Try not to add extra content to your RSS feeds, such as comments or advertisements. If you depend on ad revenue from your RSS feeds, try creating separate feeds and use robots.txt to block Google from accessing the ad-content feeds.
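To illustrate the robots.txt tips above, here is a sketch of a robots.txt that keeps the main feed crawlable while blocking a separate ad-carrying feed. The paths `/feed.xml` and `/ads-feed.xml` are made-up examples; substitute your own feed URLs.

```
User-agent: *
# The main feed at /feed.xml is not disallowed, so Googlebot can crawl it
Disallow: /ads-feed.xml
```

You can verify the effect of rules like these with the robots.txt testing tool in Google Webmaster Tools.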
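A standard, crawlable HTML link to your feed might look like the snippet below. The feed URL and title are placeholder examples; the `<link rel="alternate">` tag in the page head is the common autodiscovery convention that browsers and crawlers also recognize.

```
<!-- A plain HTML link in the page body that Googlebot can follow -->
<a href="http://example.com/feed.xml">Subscribe to our RSS feed</a>

<!-- Feed autodiscovery in the page <head> -->
<link rel="alternate" type="application/rss+xml"
      title="Example Blog RSS" href="http://example.com/feed.xml">
```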
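For reference, a minimal valid RSS 2.0 feed needs little more than a channel with a title, link, and description, plus one or more items. The content below is a placeholder example you can paste into the W3C feed validator linked above to see what a passing feed looks like:

```
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://example.com/</link>
    <description>Latest posts from Example Blog</description>
    <item>
      <title>Example post</title>
      <link>http://example.com/example-post</link>
      <pubDate>Mon, 09 Nov 2009 12:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```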
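If your platform has no feed plugin, generating the feed automatically at publish time is straightforward. Here is a minimal sketch using only Python's standard library; the site name, URLs, and post data are hypothetical placeholders, and a real site would pull `posts` from its database on each publish.

```python
from xml.etree import ElementTree as ET
from email.utils import formatdate

def build_feed(site_title, site_url, posts):
    """Build an RSS 2.0 feed string from (title, url, unix_timestamp) posts."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = "Latest posts from " + site_title
    for title, url, timestamp in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
        # RSS requires RFC 822 dates; email.utils.formatdate produces them
        ET.SubElement(item, "pubDate").text = formatdate(timestamp)
    return ET.tostring(rss, encoding="unicode")

feed_xml = build_feed("Example Blog", "http://example.com/",
                      [("First post", "http://example.com/first", 1257000000)])
print(feed_xml)
```

Hooking a function like this into your publish step means every new post lands in the feed with no manual work, which is exactly what Googlebot needs to discover fresh content quickly.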