SEO, or Search Engine Optimisation, has changed dramatically over the last few years. Gone are the days of paying for links or stuffing your website full of keywords to beat Google’s algorithms. Now, you need an established SEO strategy to get onto the search engine results – and it certainly doesn’t happen overnight.
But there are some simple and strategic steps you can take to ensure your website is search engine-friendly, and give it the best possible chance of moving up the rankings. Higher rankings mean a higher click-through rate, which means more traffic and more business!
1. Analyse the user experience
To fully assess your website for SEO, it’s best to detach yourself from any personal ties to the website. What you might like, your customers might not – and they are what matters. Run experiments by having your co-workers, distant relatives and neighbour’s dog find parts of the website or a particular product. Are they having a positive experience on the website, or are they struggling to find something? Is it unclear where they should look? This is the first step to analysing the user experience.
If you want to go a step further, install some heatmap or recording software on your website and monitor how people are interacting with the website. Back up your thoughts with the cold, hard data in Google Analytics.
User experience – along with engagement and time spent on site – is a big part of Google’s algorithms, so consider all of these factors when looking into user experience and ease of use.
2. Check the site speed and mobile-friendliness
This one ties into user experience: a fast site speed and a mobile-friendly website are crucial. Google now indexes the mobile version of a website first (known as mobile-first indexing), which underlines just how important a mobile-friendly website is. And a fast website not only boosts your rankings in the search engine results, but also means fewer people abandoning the website during page load.
3. Page markup
Best practice SEO means including certain on-page elements, such as title tags, meta descriptions and H1 tags. These should be optimised for the keywords you want to rank for, to help rankings and to help both users and Google understand the page content. Tools like Screaming Frog allow you to download and quickly view the metadata of every page on your website, so you can optimise accordingly.
Remember that title tags and meta descriptions have character limits, so make sure you aren’t too far over or under these limits – that way your listing fills the search engine results perfectly.
Title tag: 65 – 70 characters
Meta description: 155 – 160 characters
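If you’ve exported your page titles and descriptions (for example from Screaming Frog), a quick script can flag any that fall outside these ranges. Here’s a minimal sketch in Python – the page data and the lower bounds are illustrative assumptions, not fixed rules:

```python
# Flag title tags and meta descriptions that fall outside the character
# ranges above. Lower bounds are assumptions; upper bounds follow the article.
TITLE_RANGE = (50, 70)
DESC_RANGE = (120, 160)

def check_length(text, lo, hi):
    """Return 'ok', 'too short' or 'too long' for a piece of metadata."""
    n = len(text)
    if n < lo:
        return "too short"
    if n > hi:
        return "too long"
    return "ok"

# Hypothetical exported data - replace with your own crawl export.
pages = {
    "/": {
        "title": "Example Co | Widgets, Gadgets and Gizmos in Bristol",
        "description": "Example Co supplies widgets and gadgets across the "
                       "South West. Browse our full range online or visit "
                       "our Bristol showroom today.",
    },
}

for url, meta in pages.items():
    print(url, "title:", check_length(meta["title"], *TITLE_RANGE))
    print(url, "description:", check_length(meta["description"], *DESC_RANGE))
```

Run this over your full crawl export and you’ll have a quick to-do list of pages whose metadata needs trimming or padding.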
4. Broken links = messy website
Broken links – links that lead to 404 errors – signify a messy back end and hamper a positive user experience. You can check whether your website has any broken links using Search Console, which lets you export them as a CSV file. To fix an internal broken link, you’ll need to set up a redirect. We always use a 301 (permanent) redirect rather than a 302, which is only temporary (unless the redirect genuinely is temporary). The best way to do this is to create a redirect spreadsheet and map out where the broken links should be going, then implement this in your .htaccess file or via a plugin.
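As a rough illustration, a redirect mapped out in your spreadsheet might end up in the .htaccess file looking something like this – the paths are made-up examples, and this assumes an Apache server with mod_alias enabled:

```apache
# Permanent (301) redirects mapped from a redirect spreadsheet.
# Paths below are illustrative examples only.
Redirect 301 /old-product-page /new-product-page
Redirect 301 /blog/outdated-post /blog/updated-post

# Use a 302 only when the move is genuinely temporary:
# Redirect 302 /summer-sale /
```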
5. Crawlability and indexing
Google will crawl and then index a website, and this is how users will then find the website in the search results. A site that isn’t indexed won’t be showing up. To see how many of your webpages are being indexed, do a site:yourdomain.co.uk search in Google and you should see a list of the pages actively indexed by Google. Website not showing up at all? It may be that your robots.txt file is blocking Google from crawling your website, which you can check by going to yourdomain.co.uk/robots.txt and seeing what is allowed and disallowed.
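For reference, a simple robots.txt that permits crawling of the whole site apart from one section might look like this (the disallowed path is just an illustrative example):

```
# Example robots.txt - illustrative, not a recommendation for every site
User-agent: *
Disallow: /admin/

Sitemap: https://yourdomain.co.uk/sitemap.xml
```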
Make your website as easy to crawl as possible by adding internal links throughout your website. Avoid redirect loops and chains that can confuse Googlebot. Upload your sitemap to Search Console to help the crawlers understand what pages you have and where they are.
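A sitemap is simply an XML file listing your pages. A minimal example, with placeholder URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.co.uk/</loc>
  </url>
  <url>
    <loc>https://yourdomain.co.uk/example-page</loc>
  </url>
</urlset>
```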
6. Backlinks to keep, backlinks to disavow
Backlinks can be both a blessing and a curse. High-quality backlinks to your website will increase your domain authority and improve your keyword rankings. Spammy, toxic backlinks will harm your rankings in the search results, in accordance with Google’s Penguin algorithm. Tools like Majestic, Link Research Tools and even Search Console will let you export your backlinks so you can check their quality.
Any links that don’t add any benefit to your website – such as those from spammy websites built purely for link building – should be removed. You can ask the webmaster to do this or, failing that, you can resort to disavowing them. This involves uploading a disavow file in .txt format to Search Console, and Google will then ignore any links from the domains listed within.
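The disavow file itself is very simple: one domain or URL per line, with optional comments. A hypothetical example (the domains below are made up):

```
# Disavow file uploaded to Google's disavow tool
# "domain:" disavows every link from that domain
domain:spammy-directory-example.com
domain:toxic-links-example.net
http://example-bad-site.org/specific-spam-page.html
```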
This guide should help you start your SEO audit and get you thinking about some of the aspects of your website that Google will love or hate. But SEO isn’t a task that you can do once and then forget about. SEO is a constant driver, so once you’ve carried out your SEO audit, there’s a strategy to establish!
If you’d like a hand with your SEO in Bristol, get in touch with SpiderGroup by calling us on 0117 933 0570 or fill in our contact form and we will get back to you.