Conversion Rate Optimisation

7th May 2015

9 min

So as retailers begin to serve personalised experiences to their visitors and test their websites, what will this mean in the future for SEO? This post is part one of two.

SEO & Personalisation Part 1: A/B Testing Best Practice

SEO & Personalisation Part 2: Can Personalised Content Damage your SEO

 

Shock-ed by David Goehring. Flickr

Can A/B testing damage your SEO?

Does this question startle you? If it does, don’t worry! There is no need to panic. When we speak to people about optimisation testing, SEO rarely seems to be a core consideration when conducting tests, especially when CRO lives in a separate part of the business to organic search (which is not unusual).

When testing, however, SEO is a consideration that we suggest everyone takes note of. There’s no need to be scared: knowledge is power, and as long as you are aware of the risks and the SEO best practices you should be following, A/B testing will never negatively impact your SEO efforts.

With this being such a controversial subject, we teamed up with SEO expert Simon Fryer – Search Director at CandidSky – for this post; together, our advice will help keep your search rankings safe whilst A/B testing.

6 things you should know

With no public figurehead as such emerging from Google’s Webspam team, we no longer have the reassurance that Matt Cutts used to offer us. Once upon a time, if Matt Cutts – then head of Google’s Webspam team – said that A/B testing was OK, we would trust that everything was OK in terms of SEO.

Now we need to take everything that we have learned through the ever-evolving practice of SEO and apply it, in order to ensure the safety of our search engine rankings whilst redirecting to variations of landing pages for the purpose of optimisation.

1. Use canonical tags correctly

The rel="canonical" link element tells search engines which URL is the definitive version of a page, so that Google can index and refer to it correctly.

<link rel="canonical" href="https://prwd.co.uk/" />

If you are running an A/B redirect test and have 2 variations of a page, Google needs to know that they should be considered the same, so that it doesn’t index a new page with alarmingly similar content.

Simon says –

‘The use of a canonical link element primarily prevents search engines indexing a duplicate page and adjusting rankings on account of duplication. In addition, it passes around 91% to 95% of Link Authority from the donor page to the target. Therefore, if you intend to run a test for a long period of time and it’s likely that the page will acquire links, the canonical link element will ensure that any incoming Authority is attributed to the canonical page, improving its ranking potential in search engines.’
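To make that concrete, here’s a quick sketch of how the markup might look during an A/B redirect test. The URLs are purely illustrative: assume the control lives at /landing-page/ and your testing tool redirects a share of visitors to /landing-page-v2/.

<!-- On the control page: https://www.example.com/landing-page/ -->
<link rel="canonical" href="https://www.example.com/landing-page/" />

<!-- On the test variant: https://www.example.com/landing-page-v2/ -->
<!-- The canonical points back at the control, so Google treats both URLs as one page -->
<link rel="canonical" href="https://www.example.com/landing-page/" />

Whichever version a visitor lands on, the canonical always names the original URL, so any links the variant picks up are credited to the page you actually want to rank.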

2. Robots.txt, Nofollow or Noindex, follow?

The web seems to have varying opinions on the question above. To remain safe when running an A/B redirect test, you can take an extra step towards informing or blocking Google from indexing the test variants. This is the kind of precaution you want to take when running a homepage test for example.

The argument, however, is this: do you edit your robots.txt file to explicitly tell Google not to crawl specific URLs? Effective, but not always possible for a business user. Do you place a Nofollow robots meta tag in the <head> of the test variants? Or do you place a Noindex, follow tag there instead?

Simon says –

‘Whilst Robots.txt is a widely used method of preventing indexation, it’s completely unreliable. A disallow entry in Robots.txt provides directives at the root domain level and, as such, it’s very common for pages disallowed in Robots.txt to continue to be indexed when they are accessed directly by Googlebot or another crawler. Once a disallowed page is indexed, it will remain in the index irrespective of the content of your robots.txt file.

This is an example of a site which has attempted to block content using robots.txt, but the content has been indexed –

[Screenshot: Google search results showing an indexed page that is disallowed in robots.txt]

This is not an effective method for controlling indexation, and may also impede any Link Authority arriving on variant URLs. Very few people seem to be aware of this.

To ensure variant URLs are not indexed, use the “Noindex, follow” meta directive on each page. As the directive is provided at the page level, it’s 100% reliable. “Noindex” informs crawlers to omit the page from their indices. “Follow”, as opposed to “Nofollow”, directs them to continue to value inbound links to the URL, in addition to passing Authority through the site’s internal link structure. This is particularly important if you expect variants to earn external links.’
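To illustrate (the variant URL below is made up), the page-level directive Simon describes sits in the <head> of each test variant rather than in robots.txt:

<!-- In the <head> of the test variant, e.g. https://www.example.com/landing-page-v2/ -->
<!-- "noindex" keeps the variant out of search results; "follow" still lets link value flow -->
<meta name="robots" content="noindex, follow" />

A robots.txt Disallow, by contrast, only asks crawlers not to fetch the URL; as Simon’s example shows, it won’t reliably keep the page out of the index.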

3. Consider content on test variants

“What would a big reduction in content do to SEO if the variant won?” is a question all Conversion Optimisers need to be asking themselves.

We’re not just thinking about whether a search engine might confuse your variant with your control, but also about what it means for the long term. Your A/B test could give you an uplift in conversion, but that will be counteracted if, in the long term, it results in a loss of listings. Our recommendation is to run your design and code past an expert for their approval if the change is radical.

Simon says –

‘Absolutely [consider content]. A reduction in content will not necessarily result in worse organic performance. In my experience, word count and ‘keyword density’ are not ranking factors, despite what some might say. The problem is if important phrases, modifiers and synonyms are removed, it could result in a downturn in organic traffic. My advice is to be especially careful with Meta Titles; these are the most important on-page relevance indicator (also your primary sales copy) and changes here will affect performance more than anywhere else on the page.’
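To illustrate Simon’s point with some made-up copy: a leaner variant is fine, provided the important phrases – particularly those in the meta title – survive the redesign.

<!-- Control: the title carries the key phrase the page ranks for -->
<title>Men's Leather Walking Boots | Free UK Delivery | Example Store</title>

<!-- Variant: shorter page copy, but the key phrase in the title is kept intact -->
<title>Men's Leather Walking Boots | Example Store</title>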

4. Long term is too long

For us, a recurring issue is where a client gets a nice uplift from a test but doesn’t have the development resource available to take the winner out of the testing platform and hard-code it into the website. What this means is that Google crawls the page but may see content that is different to what human visitors are being served. Google thinks: hmm, suspicious – why would they want me to see something different to what their customers see?

It could be thought that you are practising the dark SEO art of cloaking – showing one version of content, optimised for search, to Google and another to your visitors. Trickery, in short.

Black cloak snip by Ian Burt. Flickr

To stay within Google’s guidelines, follow the SEO best practice of hard-coding successful test variants at the point of test completion, and avoid serving test winners to 100% of traffic for long periods of time.

We can’t find an example of Google penalising anyone for this practice, but that doesn’t mean there isn’t a horror story waiting to happen. To be safe rather than sorry, complete your testing cycle by delivering winning variants to your website promptly.

Google says –

‘If we discover a site running an experiment for an unnecessarily long time, we may interpret this as an attempt to deceive search engines and take action accordingly. This is especially true if you’re serving one content variant to a large percentage of your users.’

It couldn’t be any clearer!

5. Stay on top of the housekeeping

This is an easy one: run a test and clean up after it. Simple.

If you’re running an A/B redirect test with 4 variations including the control, once the test cycle is complete you’ll have accumulated 3 legacy pages. Now imagine you left those pages hanging around in your content management system and a copywriter or a developer accidentally linked to the wrong version of the page: you would have a problem on your hands.

Name your legacy test variants appropriately and archive them somewhere safe. This helps you avoid human error and incorrect links being accidentally indexed, and keeps your site nice and efficient.

6. Place redirects when an experiment concludes and make them permanent

When running an A/B redirection test, there is a possibility that whilst the test was running, the B variant may have received some inbound links. Once the test is completed, you don’t want to lose the SEO juice that those pages created. So, as part of your housekeeping tasks, it’s important to place permanent redirects on your discarded URLs and point them all to your control URL – the main URL that your site links to and is listed within Google.

For example, when a test concludes and you serve your winning design on your control URL, place 301 redirects on the legacy pages.


To understand how to create a 301 redirect, Moz has an easy-to-follow guide.
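As a rough example, on an Apache server the redirects could be added to the site’s .htaccess file. The paths below are hypothetical stand-ins for your legacy variant URLs and your control URL:

# Permanently (301) redirect the discarded test variants to the control URL
Redirect 301 /landing-page-v2/ https://www.example.com/landing-page/
Redirect 301 /landing-page-v3/ https://www.example.com/landing-page/

If your site runs on a different server or a CMS, the principle is the same: a permanent (301) redirect from each legacy URL to the control.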

What should you take away from this post?

It’s better to be safe than sorry, and as long as you’re aware of the risks, your SEO will remain intact. In most cases, you will have nothing to worry about if you use a good testing tool and follow SEO best practice advice once a test has concluded.

Of course, search engines continually change their algorithms and approach, so it’s important to continually evaluate your SEO practices. But there really is no need to panic, because Google aren’t out to destroy any hard work you have put in through a Conversion Optimisation programme:

Google says –

‘Small changes, such as the size, colour, or placement of a button or image, or the text of your “call to action” (“Add to cart” vs. “Buy now!”), can have a surprising impact on users’ interactions with your web page, but will often have little or no impact on that page’s search result snippet or ranking. In addition, if we crawl your site often enough to detect and index your experiment, we’ll probably index the eventual updates you make to your site fairly quickly after you’ve concluded the experiment.’

Next week is part two, detailing all you need to know about personalisation and SEO.

Lastly, a big thank you to Simon Fryer for his invaluable contribution to the post.
