When creating an SEO strategy, the design should have people in mind, while the development should consider bots.
One of the ironies of web design is that a site will only have a chance of reaching the coveted top spots on Google if Google’s bot concludes that the site is great for humans. GoogleBot does not itself decide how pages should rank: it crawls the site and brings the data back to Google, and it is Google’s algorithm that decides where the site ranks, using the data GoogleBot returns. This means that successful websites are optimised for both humans and bots and, fortunately, this is not a contradiction.
Bots, also known as site crawlers, can be thought of as digital site analysts. They can’t judge content from an emotional perspective as humans do, but they can look at the mechanics of how a site works and form a very accurate opinion of its usability and value to a human.
Even in 2018, bots are still central to how search engines work, which means they should be central to search engine optimisation. Here are six tips for optimising your SEO strategy for bots.
- Make sure your site is actually as searchable as it should be
The first thing bots do is check where they’re allowed to crawl, which they do by reading a file called robots.txt. If you’re in the habit of putting up pages and listing them in robots.txt so they aren’t crawled until they are actually live, make sure you have a process in place to remove them from robots.txt when you are ready to use them.
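You can sanity-check your own robots.txt rules with Python’s standard-library parser. This is a minimal sketch: the `/new-landing-page/` path and example.com domain are hypothetical placeholders for a page you once blocked and forgot to unblock.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that still blocks a page which is now live.
rules = """User-agent: *
Disallow: /new-landing-page/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A page left under Disallow will never be crawled, so it can never rank.
print(parser.can_fetch("Googlebot", "https://example.com/new-landing-page/"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about/"))             # True
```

Running a check like this against your live robots.txt before and after a launch is a cheap way to catch pages that are still invisible to crawlers.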
- Remember that sitemaps are good for bots as well as humans
Even if your site navigation has fantastic UX (User Experience) and you are completely confident people can find their way around without an HTML sitemap, it’s probably a good idea to add one anyway: people turn to an HTML sitemap when they are lost on your site. You also need an XML sitemap for crawlers, which does for bots what the HTML sitemap does for humans.
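An XML sitemap is just a list of URLs in a standard schema, so it is easy to generate. Here is a sketch that builds a minimal sitemap with the standard library; the example.com URLs and dates are placeholders, not from the original article.

```python
import xml.etree.ElementTree as ET

# Build a minimal sitemap in the sitemaps.org 0.9 schema.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in [
    ("https://example.com/", "2018-06-01"),
    ("https://example.com/services/", "2018-05-20"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc        # the page address
    ET.SubElement(url, "lastmod").text = lastmod  # when it last changed

print(ET.tostring(urlset, encoding="unicode"))
```

The `lastmod` field is worth keeping accurate, since it hints to crawlers which pages have fresh content.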
- Make sure all your site’s pages are linked to at least one other page
“Dead-end pages” are frustrating for humans and bots alike, so make sure there is at least one clear way into and out of every single page on your site.
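This check can be automated once you have a map of your internal links. The sketch below uses a hypothetical link map (the page paths are invented for illustration) to flag both dead ends (no way out) and orphans (no way in).

```python
# Hypothetical internal-link map: each page mapped to the pages it links to.
links = {
    "/": ["/services/", "/contact/"],
    "/services/": ["/"],
    "/contact/": [],          # dead end: no outgoing links
    "/old-promo/": ["/"],     # orphan: no page links to it
}

# Every page that is the target of at least one internal link.
linked_to = {target for targets in links.values() for target in targets}

dead_ends = [page for page, targets in links.items() if not targets]
orphans = [page for page in links if page not in linked_to]

print(dead_ends)  # ['/contact/']
print(orphans)    # ['/old-promo/']
```

In practice you would populate the link map from a crawl of your own site, then fix each flagged page by adding navigation or contextual links.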
- Keep everything as simple as you can
- Pay attention to accessibility to help make life easier for crawlers
By accessibility, we mean accessibility as defined by the Equality Act 2010. Your website should already comply with this legislation and, if it does, you’re a step ahead when it comes to optimising your site for bots. If it doesn’t, this needs to be addressed as a matter of urgency.
- Keep updating your site with relevant, high-quality content
Bots look for signs of freshness and user engagement, both of which can earn a high score and boost your page ranking. The way to achieve freshness is, of course, to keep adding new content, and the way to encourage user engagement is to make sure that content is relevant and really adds value to readers’ lives.