May 19, 2022 · 4 minute read
Let’s start with the basics. What actually is SEO? SEO, or search engine optimisation, is the process of ‘optimising’ your website to encourage users to visit via a search engine, and ultimately complete a conversion.
Usually this consists of trying to have your website rank as highly as possible within the SERPs, as this is where impressions and click-through rate are highest.
As alluded to above, having an effective SEO campaign can completely transform your online presence. SEO is vitally important for any website aiming to gain traffic, and as the Digital Marketing Institute says, “In short, SEO is crucial because it makes your website more visible, and that means more traffic and more opportunities to convert prospects into customers.”
When we talk about an ‘SEO housekeeping task’, you may wonder what we mean. We are essentially referring to tasks that are usually quite quick to carry out, and act almost as a health check for the website.
So, what should you be checking on a regular basis from an SEO perspective? Here are 5 SEO housekeeping tasks you should carry out regularly:
The first task we’ll discuss is checking response codes for target URLs - a quick and easy task, with great results if done properly. It involves checking URLs across your site, making sure that any external backlinks referencing them are still pointing to the most appropriate URLs, which should ideally return a 200 response code.
Over time, whether you’re making an active effort to build backlinks or not, your site will accumulate links. It’s also completely normal for a website to delete pages over time, which may leave URLs returning 404 response codes (or 301s, where redirects were put in place). When a page is removed from a site, the value those URLs have built up over time may be lost - particularly if external backlinks now point to a 404 page as a result. Reviewing the response codes of your target URLs ensures any historical value built up via external links can be re-harnessed and redistributed across the domain in question.
To complete a review of your target URLs, follow these steps:
1. Use Ahrefs, Majestic or a similar tool to view your site's entire backlink profile.
2. Download your entire backlink profile to Excel/Google Sheets (there is an option to view broken backlinks within Ahrefs - however, I prefer to do it this way just to be sure I have everything covered)
3. Run your target URLs through Screaming Frog (or any other website crawler)
4. Identify which URLs now return 4** response codes
5. Redirect any URLs to the most relevant URL still live on the site
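The matching step in the process above (steps 3-4) can be sketched in Python. This is a minimal illustration, not a replacement for a crawler: it assumes you have exported your backlink profile and your crawl results as CSVs with hypothetical `target_url` and `url,status_code` columns - the data below is placeholder data only.

```python
import csv
import io

# Placeholder exports - in practice, these come from Ahrefs/Majestic
# (backlinks) and Screaming Frog (crawled status codes).
backlinks_csv = """referring_page,target_url
https://blog.example.org/post,https://example.com/old-page
https://news.example.net/story,https://example.com/
"""
crawl_csv = """url,status_code
https://example.com/,200
https://example.com/old-page,404
"""

# Map each crawled URL to its response code
statuses = {row["url"]: int(row["status_code"])
            for row in csv.DictReader(io.StringIO(crawl_csv))}

# Target URLs with external backlinks that now return a 4** code
to_redirect = sorted({row["target_url"]
                      for row in csv.DictReader(io.StringIO(backlinks_csv))
                      if 400 <= statuses.get(row["target_url"], 0) < 500})
print(to_redirect)  # ['https://example.com/old-page']
```

Each URL in `to_redirect` is a candidate for step 5: point it at the most relevant live page so the link equity isn't wasted.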
The next task we’ll discuss is similar to the one above. Rather than checking links from external sites, you are going to check that your internal links are all pointing to pages which return either 200 or 301 response codes.
Again, use Screaming Frog or a similar website crawler to identify any links pointing at 4** URLs. Once you have identified these URLs, either amend the links referencing the URL in question, or redirect the 4** URL.
If you have an abundance of internal links pointing to pages which no longer exist, this doesn’t reflect well when Google crawls the site. It’s also poor UX, and will likely lead to a high exit rate on these dead links.
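The same check can be sketched with Python's standard library: collect the hrefs from a page and flag any whose crawled status is a 4** code. The page source and status map below are placeholders, assuming you already have status codes from a crawl.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

# Placeholder page source and crawl results
page_html = '<a href="/contact">Contact</a> <a href="/old-offer">Offer</a>'
statuses = {"/contact": 200, "/old-offer": 404}

parser = LinkCollector()
parser.feed(page_html)
dead_links = [href for href in parser.links
              if 400 <= statuses.get(href, 0) < 500]
print(dead_links)  # ['/old-offer']
```

Any href in `dead_links` should either be amended in the page's HTML or have its target URL redirected.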
Very similar to the task outlined above, check that the external websites and URLs you link to from across your site still return the relevant page. If a page no longer exists, this is another case of poor user experience and will frustrate users.
Once again, crawl your site within Screaming Frog, go to the ‘External’ tab and check all status codes. Any URLs which return 4** codes are your first port of call - amend or remove those links as soon as possible!
It goes without saying that your ranking position within the SERPs directly influences the amount of traffic your site is likely to receive. Whilst I usually encourage others not to provide clients with ranking reports, as they often become fixated on one keyword, ranking reports can be really effective when used efficiently by SEOs. A task I would recommend completing every 6-8 weeks is to look at keywords currently ranking in positions 8-12.
Keywords in these positions provide real opportunity for your site. With just a few minor changes you could see these terms make their way further up the results pages, leading to more impressions and ultimately traffic (and conversions) for your website.
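Pulling these ‘striking distance’ keywords out of a rank-tracking export is a one-liner. The CSV below is placeholder data, assuming your tool exports hypothetical `keyword` and `position` columns:

```python
import csv
import io

# Placeholder ranking export (keyword, current position)
rankings_csv = """keyword,position
seo housekeeping,9
technical seo audit,3
broken link checker,11
xml sitemap guide,25
"""

# Keywords ranking in positions 8-12 - small changes can push these up
striking_distance = [row["keyword"]
                     for row in csv.DictReader(io.StringIO(rankings_csv))
                     if 8 <= int(row["position"]) <= 12]
print(striking_distance)  # ['seo housekeeping', 'broken link checker']
```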
The last task we’ll discuss in this article (there are more to come in a follow-up post) is to compare your XML sitemap with your crawl once every few months. As websites are ever-changing entities, it is normal for pages to be added and removed over time.
As most XML sitemaps dynamically update, this usually doesn’t cause any issues for the majority of us; however, I would still recommend manually checking this once every 2 or 3 months. The XML sitemap you have uploaded to your site allows Google to develop a better understanding of your site before attempting to crawl it, so ensuring there are no unnecessary pages within the file - and that everything that should be included is - is vital.
Crawl your sitemap, and crawl your site. Export both into Excel/Sheets and conduct a VLOOKUP between the two. If there are any discrepancies, add or remove URLs in the XML sitemap as appropriate.
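If you'd rather script the comparison than run a VLOOKUP, a simple set difference does the same job. The sitemap snippet and crawled URL set below are placeholders; the namespace is the standard sitemaps.org one:

```python
import xml.etree.ElementTree as ET

# Placeholder sitemap and crawl export
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/removed-page</loc></url>
</urlset>"""
crawled_urls = {"https://example.com/", "https://example.com/new-page"}

# Extract every <loc> from the sitemap (namespace-aware lookup)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {loc.text for loc in
                ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", ns)}

# URLs in the sitemap but not on the live site, and vice versa
print("Remove from sitemap:", sorted(sitemap_urls - crawled_urls))
print("Add to sitemap:", sorted(crawled_urls - sitemap_urls))
```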
If you want some advice on what changes you should be looking to make for keywords in these positions, give Statuo a call today on 01204896440 or get in touch via our website.