6 Tips on How to Optimise your SEO Strategy for Bots

When creating an SEO strategy, design it with people in mind, but develop it with bots in mind.

One of the ironies of web design is that a site only has a chance of reaching the coveted top spots on Google if Google’s bot concludes that the site is great for humans. GoogleBot does not decide where pages rank: it crawls the site and brings the data back to Google, and it is Google’s algorithm that uses that data to decide where the site ranks. This means that successful websites are optimised for both humans and bots and, fortunately, the two goals are not a contradiction.

Bots, also known as site crawlers, can be thought of as digital site analysts. They can’t judge content from an emotional perspective as humans do, but they can look at the mechanics of how a site works and form a very accurate opinion of its usability and value to a human.

Even in 2018, crawlers are still central to how search engines function, and that means they should be central to search engine optimisation. Here are 6 tips for optimising your SEO strategy for bots.

  1. Make sure your site is actually as searchable as it should be

The first thing bots do is check where they’re allowed to crawl, and they do this by reading a file called robots.txt. If you’re in the habit of putting up new pages and listing them in robots.txt so they aren’t crawled before they go live, make sure you also have a process in place to remove them from robots.txt once they are ready to be used.
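As a minimal sketch, assuming a hypothetical staging page at /new-landing-page/, the blocking rule you would need to remove from robots.txt when the page goes live might look like this:

```
User-agent: *
Disallow: /new-landing-page/   # remove this line once the page is live and should be crawled
```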

  2. Remember that sitemaps are good for bots as well as humans

Even if your site navigation has fantastic UX (User Experience) and you are completely confident people can find their way around without an HTML sitemap, it’s probably a good idea to add one anyway; people reach for an HTML sitemap when they are lost on your site. It’s also necessary to have an XML sitemap for crawlers: the XML sitemap does for bots what the HTML one does for humans.
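As a rough illustration (the URLs and dates below are placeholders), a minimal XML sitemap follows the sitemaps.org format and can be submitted to Google via Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2018-05-20</lastmod>
  </url>
</urlset>
```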

  3. Make sure all your site’s pages are linked to at least one other page

“Dead-end pages” are frustrating for humans and bots alike, so make sure there is at least one clear way to move into and out of each and every page on your site.
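As a rough sketch of how you might spot the “into” half of that problem, the script below (assuming a small site with a sitemap at /sitemap.xml, using the requests and beautifulsoup4 packages; the domain is hypothetical) compares the pages your sitemap declares against the internal links actually found on those pages, and flags anything nothing links to:

```python
# Rough sketch: find sitemap URLs that no crawled page links to (orphan pages).
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin
from xml.etree import ElementTree

SITE = "https://www.example.com"  # hypothetical domain

# Collect the URLs the sitemap says exist.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ElementTree.fromstring(requests.get(SITE + "/sitemap.xml").content)
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

# Collect every internal link found on those pages.
linked_urls = set()
for page in sitemap_urls:
    soup = BeautifulSoup(requests.get(page).text, "html.parser")
    for a in soup.find_all("a", href=True):
        href = urljoin(page, a["href"]).split("#")[0]
        if href.startswith(SITE):
            linked_urls.add(href)

# Pages in the sitemap that nothing links to are potential orphans.
for orphan in sorted(sitemap_urls - linked_urls):
    print("No internal links point to:", orphan)
```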

  4. Keep everything as simple as you can

Google doesn’t like sites that use Flash, and while it still struggles to understand JavaScript, it is getting better at handling it, as long as you implement it well. Most websites have some JavaScript on them and some are built mostly with JavaScript, so keep the parts a crawler needs, content and navigation, as simple as you can.
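As a simple illustration (the markup is hypothetical), a plain HTML link is something a bot can always follow, whereas a destination that only exists inside a JavaScript handler may never be discovered:

```html
<!-- Crawler-friendly: a real link the bot can follow -->
<a href="/services/">Our services</a>

<!-- Risky: the destination only exists inside JavaScript -->
<span onclick="window.location='/services/'">Our services</span>
```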

  5. Pay attention to accessibility to help make life easier for crawlers

By accessibility, we mean accessibility as defined by the Equality Act 2010. Your website should already comply with this legislation and, if it does, you’re a step ahead when it comes to optimising your site for bots: many of the things that make a site accessible, such as alt text on images, descriptive link text and a logical heading structure, are exactly the signals crawlers read. If your site is not compliant, this needs to be addressed as a matter of urgency.
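For instance (the content below is hypothetical), alt text and a sensible heading hierarchy serve screen readers and crawlers alike:

```html
<h1>Garden design services</h1>
<h2>What we offer</h2>
<!-- Alt text describes the image for screen readers and gives crawlers context -->
<img src="/images/patio.jpg" alt="Newly laid sandstone patio with raised flower beds">
<!-- Descriptive link text tells both users and bots where the link leads -->
<a href="/contact/">Request a garden design quote</a>
```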

  6. Keep updating your site with relevant, high-quality content

Bots look for signs of freshness and user engagement, both of which can boost your page ranking. The way to achieve freshness is, of course, to keep adding new content, and the way to encourage user engagement is to make sure that content is relevant and really adds value to your readers’ lives.