Google Encrypted Search

It’s been a while now since Google announced that they were planning to remove keyword level referrer data from search traffic for logged in users on Google.com, and despite the initial hysteria about the SEO industry dying (again) as a result, most of what I’ve heard so far suggests that Matt Cutts’ original prediction that the change would only affect a single-digit percentage of traffic has been borne out.

Keyword Cat

That’s not to say I’m in favour of it. I’m not. I do believe that Google’s motivation is about protecting user privacy, but I also think that the main thing they’re protecting is their business model. I think that they’re looking at ways of blocking services like AdBrite from taking referrer data to serve appropriate advertising. That might smell of anti-competitive behaviour.

BUT.

It also reflects a wider change in the market towards user concern about the way their data is being used and shared online. I doubt that many of the websites running a lot of contextual advertising solutions have an appropriately worded privacy policy about how and where data is going to be used, and that probably gives Google more headaches than the SEO industry ever will. When you search (in Google) for the term “Google Sued Privacy”, there are 14 million results. That’s not to say they’ve been sued 14 million times, just that when they do get sued, it gets noticed.

Rightly or wrongly, Google are being held up as the face of the Internet, and that’s attractive to lawyers looking for profit.  With their seemingly bottomless pockets, Google are ripe for a little bit of legal action, and when you have a legal establishment that lacks fundamental understanding of the way the web works, you have a situation where it becomes pretty easy to get an expert testimony that looks pretty damning.

Here’s a quote from an October 2010 article about yet another class action suit about Google privacy:

User search queries, which often contain highly-sensitive and personally identifiable information, are routinely transferred to marketers, data brokers, and sold and resold to countless other third parties.

Sound familiar?

From my perspective as a marketer, losing data hurts.  It hurts because at Latitude, we use that data to improve the targeting of campaigns and deliver great results for our clients, but it’s not the end of the world, it’s an opportunity to ensure that we use data better, ensure that what we have is as accurate as possible so that we can glean more from it, and use that data more effectively and efficiently.

According to Search Engine Watch, Google had around 1 billion unique users in May, and according to this Telegraph article, there are about 260 million Gmail accounts.  Worst case scenario – we lose referrer data for around 26% of users.

Google User Breakdown

Of course that will never happen: not every Gmail account is genuine (confession: I have 7 Gmail accounts), and not everyone is always logged in when they search. The SEO industry will still have data, not as much maybe, but plenty to base assumptions on, which, judging by the articles I’ve read over the past couple of weeks on this subject, is something we do pretty well already.

No need to panic - we have a sample

 

Better Engagement Strategies

There’s an acceleration in the importance of social signals to search engines as part of their ranking calculations, and from my perspective, these fall into two main categories: the engagement of users with content, and the willingness of users to share content.

Engagement

Engagement relates to the way in which users interact with a website throughout their search process. Duane Forrester from Bing was interviewed on Stone Temple Consulting, and talked about user interaction within the search results as being a key social factor in the Bing ranking algorithm. Elements such as Click Through Rate (CTR) from the search results, bounce rate, the length of a session in terms of pages viewed and time spent on the website, and the number of returning visitors are all metrics that are considered. When you think about the volume of data Google collects via their cookie and analytics, this seems like a no-brainer. There are ways to improve this.

Improving CTR

In search, your ability to communicate with users is relatively limited. The average listing gives you just 66 characters for the title and 160 characters for the description.
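As a rough illustration, here’s a small Python sketch (the page data and URLs are made up) that flags any title or description likely to be cut off, using the 66 and 160 character figures above as the limits:

    # Flag titles and meta descriptions that risk truncation in the results.
    # The 66 / 160 character limits follow the figures quoted above; the
    # page data is purely illustrative.
    TITLE_LIMIT = 66
    DESCRIPTION_LIMIT = 160

    pages = [
        {"url": "/blue-shoes",
         "title": "Blue Shoes | Free UK Delivery | Example Shoes",
         "description": "Browse our full range of blue shoes with free UK delivery and easy returns."},
        {"url": "/red-court-shoes",
         "title": "Red Court Shoes - Save 20% This Week Only at Example Shoes Online Store",
         "description": "Shop red court shoes in every size. Save 20% this week, free delivery over £50, 30 day returns and a price match promise on every pair of shoes in our huge range."},
    ]

    for page in pages:
        title_over = len(page["title"]) - TITLE_LIMIT
        desc_over = len(page["description"]) - DESCRIPTION_LIMIT
        if title_over > 0:
            print(f"{page['url']}: title is {title_over} characters over the limit")
        if desc_over > 0:
            print(f"{page['url']}: description is {desc_over} characters over the limit")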

We use meta descriptions to communicate the USP of your website, but if our focus is to deliver a higher CTR from the search results, the most effective message needs to be identified and used. One way of doing this is to take the ad-text variations that deliver the best CTR in paid search and use them as the basis for a more successful message in natural search.
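As a sketch of that approach (with illustrative paid search numbers), you can rank your ad variations by CTR and use the winner as the starting point for a new title or description test:

    # Rank paid search ad-text variations by CTR so the best performer can
    # be trialled as the basis of a natural search title or description.
    # All figures are illustrative.
    ad_variations = [
        {"text": "Blue Shoes - Free Next Day Delivery", "impressions": 12000, "clicks": 540},
        {"text": "Blue Shoes - Save 20% This Week",     "impressions": 11500, "clicks": 690},
        {"text": "Buy Blue Shoes Online From £24.99",   "impressions": 9800,  "clicks": 410},
    ]

    for ad in ad_variations:
        ad["ctr"] = ad["clicks"] / ad["impressions"]

    for ad in sorted(ad_variations, key=lambda a: a["ctr"], reverse=True):
        print(f"{ad['ctr']:.2%}  {ad['text']}")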

Emotive terms such as “free” and “discount” tend to have a positive effect on CTR, but specific numbers can also catch the eye. Competitive prices help to prequalify users, but discounts such as “save 20%” also work pretty well.

Reducing Bounce Rate

A high bounce rate is not always a bad thing. There are occasions where a user might be looking for a quick answer to a question and find that answer within a few seconds of arriving at your site. That isn’t always the case, though, and a high bounce rate can indicate a lack of satisfaction.

Compare bounce rates across pages that share a function to see whether there is a pattern that explains why one page has a higher bounce rate than another. The cause might be the type of search term that people are using to enter the page, it might be the price of an item, or it might be that the page has a horrible picture of a product. If your site has a consistently high bounce rate across the board, it might be because your design is horrid.
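As a minimal sketch of that comparison, assuming you’ve exported page-level bounce rates from your analytics package (the templates and figures below are invented), you can group pages by template and flag the outliers:

    # Group pages by template and flag any page whose bounce rate sits well
    # above the average for that template. Figures are invented.
    from collections import defaultdict

    pages = [
        {"url": "/shoes/blue",  "template": "category", "bounce_rate": 0.42},
        {"url": "/shoes/red",   "template": "category", "bounce_rate": 0.45},
        {"url": "/shoes/green", "template": "category", "bounce_rate": 0.71},
        {"url": "/guides/fit",  "template": "guide",    "bounce_rate": 0.55},
        {"url": "/guides/care", "template": "guide",    "bounce_rate": 0.58},
    ]

    by_template = defaultdict(list)
    for page in pages:
        by_template[page["template"]].append(page)

    THRESHOLD = 0.15  # flag pages more than 15 points above their template average

    for template, group in by_template.items():
        average = sum(p["bounce_rate"] for p in group) / len(group)
        for page in group:
            if page["bounce_rate"] - average > THRESHOLD:
                print(f"{page['url']}: {page['bounce_rate']:.0%} vs a {average:.0%} average for {template} pages")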

Increasing Session Length

Encouraging users to explore more of the website is always helpful from a marketing perspective because it exposes customers to more products.  There are loads of ways of encouraging people to explore.  Wikipedia always does this pretty well, but as a note of caution, this type of activity needs to be done gradually, because you always need to protect your conversion rate.

Sharing

Sharing involves transactions around links to content in Twitter, Facebook, +1 metrics, social bookmarks, and also the number of people who link to your content from other media such as forums and blogs. Google accesses sharing data via a number of sources: public social media profiles, traffic flow monitored via analytics, and the goo.gl URL shortening service. They also have a small social platform of their own ;-).

Old school links are dying, because they don’t reflect genuine user behaviour any more.  Greater personalisation within the search results means ever greater value is placed on real human linking techniques.

Sharing existing content

You probably already have bookmarking / sharing functionality on your site, in the form of Facebook, Twitter and +1 buttons.  The question is whether you test them and treat sharing as a goal.

Try making the buttons bigger, and experimenting with different placements to make them more prominent.  Zero share counts can reduce user propensity to use the buttons whereas an active community can self promote.
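If you do treat sharing as a goal, one way to compare two placements is a simple two-proportion test on share clicks per page view. The sketch below uses invented numbers and a plain normal approximation rather than a stats library:

    # Compare share-button click rates between two placements using a
    # two-proportion z-test (normal approximation). Numbers are invented.
    import math

    def z_test(clicks_a, views_a, clicks_b, views_b):
        p_a = clicks_a / views_a
        p_b = clicks_b / views_b
        pooled = (clicks_a + clicks_b) / (views_a + views_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
        return p_a, p_b, (p_b - p_a) / se

    p_a, p_b, z = z_test(clicks_a=120, views_a=40000, clicks_b=175, views_b=41000)
    print(f"Placement A: {p_a:.2%}, Placement B: {p_b:.2%}, z = {z:.2f}")
    # A |z| above roughly 1.96 suggests the difference is unlikely to be noise.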

New Content

I promised myself that I would never use the phrase “content is king”, and I just have, because it is.

Creating great content that adds value to any website needs to be at the core of customer centric SEO. If you don’t have a blog of some kind, you need to ask why. A blog is the foundation of a corporate community. It is an opportunity to communicate values beyond a simple description on a page and allows greater engagement with users. Good blogs bring personality to a site and give a business an opportunity to expand their brand into new areas, as well as giving people an opportunity to feed back.

A well balanced blog informs users, gives advice, and of course sells, subtly.

Product Posts

You’ll always end up with some posts that are there to sell less subtly.  Thanks to the freshness elements of Google’s algorithm these posts can rank pretty quickly for competitive terms.

  • New this month…
  • Great value discounts across the XXX range…
  • Christmas gift ideas under £50…

These posts could all link to individual products using keyword rich anchor text that helps to boost search engine rankings. The freshness elements of Google’s algorithm will also reward new content with higher initial rankings, as the content will be perceived to be highly relevant when the post is published.

Optimising posts around individual product names should result in quick SEO rankings albeit for a short period of time. Provided a clear path to conversion is present from such blog posts, there is no reason why traffic from this source shouldn’t convert as well as any other.

Informative Posts

These are posts that are designed to provide guidance to users. They shouldn’t be explicitly sales led, but they’re useful for linking to relevant products that might help to close the circle:

  • 5 great  tips…
  • A guide to something…
  • Helpful advice for doing something better…

These types of post should be written as useful resources that can rank for search terms outside the key product set, but could still potentially encourage sales. In most cases, these types of informative post should be the ones that will deliver the most value in terms of shares via social media as they are perceived as being advice that people are comfortable sharing with their own network.

Culture Posts

These are posts which tell the story of a business and do something that goes beyond self promotion. If anything, these are the posts that will be shared least and read most. They are also the ones that are hardest to write, because not everyone feels comfortable telling the world about themselves.

They give you a chance to tell customers about the real you, and provided that they have an authentic voice, they can be uber successful at increasing engagement with your brand.

Save Our Data

I’m not going to write a full post about Google taking away referrer data, but I thought it would be fun to create a couple of badges that you might want to use as a peaceful protest against it.

Here is a nice campaign badge for Save Our Data (SOD) that you might want to use on your blog:

Save our Data

And here is a frankly beautiful image that you could use as a friendly twitter avatar.

S.O.D.

Feel free to download and share…

Stop Using Obvious Anchor Text

I generally take the view that search engine algorithms are written to approximate human perception of relevance.  They are also structured in such a way as to apply value to their ranking factors based on generalisations of human behaviour.  In theory, a website that better reflects human behaviour will be one that receives better search engine rankings.

No one is going to argue that link building is not a massive part of any SEO campaign (OK, some people will…).  You need links to increase authority, you need links to build PageRank, and for many industries they’re not easy to come by, so you get webmasters of relevant sites to put a text link to your site on theirs.  I’ve heard that on occasions  money can change hands to cover any costs incurred, but that’s a debate for another day.

When an SEO person requests a link, they will generally specify the anchor text that they want to be used to link to the relevant page.  This is because, as everyone knows, a link that uses descriptive anchor text is better for SEO, and will help your site to rank for your chosen keyword, and make you pots of cash.

If you were building links for a site that sold something like car insurance, you might end up with a distribution of anchor text that looks something like this:

SEO Anchor Text Distribution

As an SEO, you’d be really proud of your great work in getting loads of (in this case 6) amazingly relevant links, and you’d expect to see some great improvements in ranking off the back of them.

But…

This method seems to totally ignore the way that real people behave when sharing content.  If you go back 5 years or so, the barrier to entry to publishing on the web was much higher than today.  Before Facebook or Twitter, you needed to have a website, which meant you needed to know some HTML, which meant that you were fairly web savvy, which in turn meant that you would probably have heard of accessibility, and would use reasonably helpful anchor text for links.  You would also probably link from a web page that contained all your other “useful resources”.

It’s now 2011. There are 750 million people on Facebook and 200 million or so on Twitter. There are more people sharing content now than ever before, and let’s be honest, they don’t behave like webmasters on the internet of yore. I was running a training course on SEO for WebCredible yesterday, and I asked the attendees how they would link out from a blog. This is the distribution of link anchor text that they said they’d use:

Natural Anchor Text Distribution

Notice the difference?

Under a paradigm where links are predominantly given by real people rather than incentivised webmasters, the distribution of anchor text that is used is significantly different from the one we would choose ourselves. That’s hugely important, because if we also believe that the search engines are basing their relevance on the behavioural norms of “now”, a site that adheres to the old ways of linking will quickly look obsolete under a user centric set of algorithmic rules and be consigned to the lower reaches, because it is no longer demonstrably relevant to the users it purports to target.

If you come at this from the Vince Update angle that Google rewards brands, then the value of branded links is underlined, in red ink.
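To see how your own profile compares, you can bucket backlink anchor text into exact or partial match, brand, and “natural” phrases. The sketch below is only illustrative; the brand name, target keyword and anchor list are all made up:

    # Bucket backlink anchor text and report the share of each bucket.
    # The anchors, brand and keyword are made up for illustration.
    from collections import Counter

    BRAND = "acme"
    TARGET_KEYWORD = "car insurance"

    anchors = [
        "car insurance", "cheap car insurance", "car insurance", "Acme Insurance",
        "click here", "this post", "www.acme-insurance.example", "acme", "great read",
    ]

    def bucket(anchor):
        text = anchor.lower()
        if TARGET_KEYWORD in text:
            return "exact / partial match"
        if BRAND in text:
            return "brand"
        return "natural / other"

    counts = Counter(bucket(a) for a in anchors)
    total = sum(counts.values())
    for name, count in counts.most_common():
        print(f"{name}: {count / total:.0%}")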

Internal Anchor Text

A quick note on internal anchor text.  This is the area that you have complete control over, and this does need to remain relevant to pages that are being linked to.

Online Reputation Protection

When you look at the traffic data from a well established website with a recognised brand you will see that around a third of traffic comes from brand terms, and that this traffic converts around twice as well as non brand or generic traffic.  In part this is due to the research / conversion process that users go through, but only a fool would argue that ATL advertising does not have some impact on visits from brand terms.
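A rough way to check that split for your own site, assuming you have a keyword-level export from analytics (the keywords and figures below are invented), is to segment visits into brand and non-brand and compare conversion rates:

    # Split search traffic into brand and non-brand terms and compare
    # conversion rates. Keyword data is invented for illustration.
    BRAND_TERMS = ("acme", "acmeshoes")

    rows = [
        {"keyword": "acme shoes",         "visits": 5200, "orders": 310},
        {"keyword": "acme discount code", "visits": 900,  "orders": 60},
        {"keyword": "blue shoes",         "visits": 7400, "orders": 190},
        {"keyword": "court shoes",        "visits": 3100, "orders": 85},
    ]

    def is_brand(keyword):
        return any(term in keyword.lower() for term in BRAND_TERMS)

    for label, group in (("brand", [r for r in rows if is_brand(r["keyword"])]),
                         ("non-brand", [r for r in rows if not is_brand(r["keyword"])])):
        visits = sum(r["visits"] for r in group)
        orders = sum(r["orders"] for r in group)
        print(f"{label}: {visits} visits, conversion rate {orders / visits:.1%}")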

If you were to walk along the high street in any given town, you’d be unlikely to see a protest about a specific commercial entity, and you would also rarely see any graffiti saying that a particular business was a scam or provided hideous customer service. Online, however, it is a different story: bad reviews, blogs about poor customer experience and deliberate smearing of a business can be prominent when users are searching, and this can have a big impact on a user’s propensity to buy from a particular supplier. It goes further than this, of course: with Google Suggest and Google Instant, you can be alerted to negative sentiment about a company before you even get to the results page:

A bad suggestion?

Will this put people off?  Probably.  If I was looking for a product and the company selling it was smeared as a scam, then I’d be a lot more likely to look into them and find out more.  If I didn’t have the time, I’d probably shop somewhere else.

What is happening?

You might have read the news about the Italian Wikipedia recently being closed down temporarily due to changes in the law in Italy that mean potentially false negative information about an individual or business must be removed within a short period or the publisher will face a fine. That’s bad news for Italians, but it doesn’t go any way towards addressing the problem. The internet has a long memory, and the nature of the social web means that content can be shared way beyond its original source very quickly, into places where there is no editorial control. With something like Google Suggest, where the visibility of keywords is based on user behaviour, you have a situation where negative information can continue to propagate itself indefinitely.

Why do you have a problem?

It’s really easy to publish to the web.  There is essentially no barrier to entry these days.  If I wanted to, I could anonymously register a Blogger or WordPress domain accusing a company of fraudulent practice or bad customer service in minutes, and write content that would see it do reasonably well in Google just off the fact that it is fresh and relevant.  The thing that stops me?  Inclination.  I need a motive to expend the effort in doing all that sort of thing.  That’s not to say I wouldn’t, it’s just that it would take a really  bad experience for me to be bothered to do it.

And that’s the root of the problem.

Companies that do not value their customers and treat them with contempt will inevitably annoy them. A small grievance about service levels can easily become a vendetta if nothing material is done to resolve the problem. Unhelpful customer service staff in an offshore call centre, a refusal to accept responsibility for a fault, a lack of response to a genuine grievance, or a rude customer service rep will all increase the frustration that a person feels.

The wrong course of action

In too many cases, the first course of action – in many ways, the expected course of action for a company that does not value customers – is to try and bury the problem.  Writing fake reviews, astroturfing forums with effusive praise from sock puppets, registering hundreds of additional branded domains and launching microsites, and threatening legal action will do very little to solve the problem.  It might paper over the cracks for a little while, but it won’t have much of a long term effect.  In fact, it will probably do more harm than good.

There are too many companies who think that reputation management is something that needs to be done after the fact.  It has become crisis management rather than proactive brand building, but that is a mistake.  A reputation is earned not faked.  A company that tries to hide their negative reputation is like King Canute trying to hold back the tide.  They can stop the flow around the point where they are standing, but will ultimately fail.

The right course of action

The right thing to do is to treat complaints as the beginning of a process.  Understanding why someone complained is essential.  Rather than trying to hide from the problem, look at the root of that problem.

  • Does your customer service frustrate people?
  • Do you have a bad product?
  • Are your staff cynical and rude?

Then ask why this is the case:

  • Does your organisation not value customers?
  • Do you fail to deliver the service that customers expect?
  • Are staff properly trained?
  • Are staff motivated?

A common argument about delivering decent service is that it costs too much and that those costs would have to be passed on to consumers. Bollocks. The cost of decent service is nothing compared to the opportunity cost of having people put off dealing with your business because you are perceived to be thieves or scum.

The fact is that people will pay more for “better”. People will pay more for security. We might live in straitened times of economic insecurity, but we also live in an era of social media and instant feedback. If you look at the PC or phone market, there is a movement towards companies like Apple. When people have less to spend, they would rather spend that money wisely, and this means being more careful about who they deal with.

You can clean up the SERPs, but what you really need to do is clean up your act.

Speaking at the Figaro Social Media Conference

Last week, on behalf of Latitude, I presented about social marketing at the Figaro Digital Social Media Conference at the Magic Circle. It was a phenomenal venue to speak at, and a really great day with some excellent presenters from a range of agencies and businesses who are engaged in social media.

I talked about a few different ideas during the session, stressing the need for proper audience research in social marketing.  I suggested that in many cases, smaller enterprises have a natural affinity with social marketing due to the way   in which they are more engaged with their customers as individuals than larger corporations can be.

I spent a lot of time discussing the need to be willing to market one to one with potential customers through social channels and the difficulty in making this approach scalable. My proposed solution was to find a community of brand advocates within related forums and to reward them for their evangelism. I touched on the importance of having specific goals for any social marketing activity in order to make it measurable, while also stressing the difficulty of doing so.

I referenced a 1989 paper by Paul Pangaro on the architecture of conversation, produced following a study to improve organisational communication within large corporations. I’d highly recommend reading it.

There were some other great presenters on the day, including Sarah Carter from Actiance, John Bennet from Syzygy and Hal Stokes at Essence, as well as James Paterson, who leads the social strategy for O2. They all delivered great insight into the state of the art.

I’ve embedded my slides from the event below.  Figaro are going to post the videos from the day on their site later this week.

Better Attribution of Sources

I’ve blogged before about the importance of linking out in order to establish credibility for your content in line with the expectations of Google’s PageRank algorithm.  Proper citation of sources is an essential aspect of the academic model that PageRank derives from and it lends a great deal of weight to content if it is to be trusted.

One of the biggest challenges of the real-time web is presenting authority.  It’s easy (some might argue too easy) to publish online, and misleading content can often be taken as fact, with significant impact.  Unfounded stories can create panic behaviour and result in un-forecast outcomes.

It’s theoretically possible to leverage one of the major wire services to disseminate false information, and have it appear in a respected second party source like AP or Reuters with very little editorial control.  When these sources are picked up by Google News, readers can be exposed to false content very quickly.

Google would argue that there are already controls in place around syndication through Google News, but given the ease with which speculative articles like this one can appear alongside verifiable sources, there is still some way to go in monitoring the content published.

Last year Google announced their new syndication-source and original-source meta tags, which are designed to provide attribution to the source of content being published in Google News. The blog post that announced them is not explicit in stating that these tags will be used in determining the validity of sources being considered for Google News, but there are comments in the post that point to that conclusion:

we’re hopeful that this approach will help determine original authorship, and we encourage you to take advantage of them now

Google have just announced a further link tag for news publishers that is designed to credit original work. rel=standout is intended to be used to credit the primary source of a news story in order to add value to the content that is being published. This time, Google have been explicit in stating the importance of linking out:

Linking out to other sites is well recognized as a best practice on the web, and we believe that citing others’ standout content is important for earning trust as you also promote your own standout work
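For reference, these are just tags in the page head. Here’s a minimal sketch of generating them for an article; the URLs are placeholders, and Google’s own documentation is the place to check the exact rules for each tag:

    # Build the attribution tags described above for a news article:
    # the syndication-source / original-source meta tags and the
    # rel="standout" link. The URLs are placeholders.
    def attribution_tags(source_url, standout_url=None):
        tags = [
            f'<meta name="original-source" content="{source_url}">',
            f'<meta name="syndication-source" content="{source_url}">',
        ]
        if standout_url:
            tags.append(f'<link rel="standout" href="{standout_url}">')
        return "\n".join(tags)

    print(attribution_tags("http://www.example.com/news/original-story",
                           standout_url="http://www.example.com/news/original-story"))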

With more and more people using the web as their sole source of information, authority and verifiability are of increasing importance to readers, and need to be a key focus for publishers.  Automated curation is no replacement for editorial oversight, but the ability to understand primary sources is of huge value to users and publishers alike.

Conversion Friendly Websites

Yesterday, I blogged about the need for conversion friendly web design, which was inspired by a news story that was doing the rounds about how online retailers were losing sales because of the way they presented their sites. I thought it would be a good idea to follow it up with some principles that are useful to consider when optimising a site for conversion.

Understand the Goal

Seems simple, right? Actually, not. Even sites that convert pretty well often just seem to have a goal that equates to “a sale”. What about “sell more”? A user who purchases one item is not a profitable user. Think about the cost of acquisition, the margin on the item, the cost of delivery, and the cost of the site in general. Focusing on just selling something is not good enough; you need to encourage customers to buy more of what makes you the most money. A goal which is to “sell more” rather than just “make more sales” will make you more money. Simple.
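To make the point with a back-of-the-envelope sketch (all of the figures below are invented), work out what a single order actually leaves behind once acquisition and fulfilment are paid for, and what happens when the basket grows:

    # Back-of-the-envelope profit per order. All figures are invented.
    item_price = 45.00
    margin_rate = 0.35          # gross margin on the item
    cost_of_acquisition = 9.50  # media spend per order
    delivery_cost = 4.95

    def profit(items_per_order):
        gross_margin = item_price * margin_rate * items_per_order
        return gross_margin - cost_of_acquisition - delivery_cost

    print(f"Profit on a single-item order: £{profit(1):.2f}")
    print(f"Profit on a two-item order:    £{profit(2):.2f}")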

Map the Path to the Goal

OK, this is Usability 101.  Having a clear understanding of the entire customer journey from research to purchase, and the touchpoints with your site along the way is essential.

  • What search terms do they use?
  • What pages do they enter on?
  • How do they find the product they’re looking for?
  • What are the steps from that page to the checkout?
  • Where do customers opt out of the journey?

Analytics will help you here.  A lot.  Use funnels to measure user interaction, aggregate data into measurable chunks, understand the barriers to conversion.

User Journey Funnel
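A funnel report is really just step-on-step attrition. Here’s a minimal sketch, with made-up step counts, showing where the biggest drop-off sits:

    # Report the drop-off between each step of a purchase funnel.
    # Step names and counts are made up.
    funnel = [
        ("Product page",      50000),
        ("Added to basket",    9000),
        ("Checkout started",   5200),
        ("Payment details",    3900),
        ("Order confirmed",    3400),
    ]

    for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
        drop = 1 - next_users / users
        print(f"{step} -> {next_step}: {drop:.0%} drop-off")

    print(f"Overall conversion: {funnel[-1][1] / funnel[0][1]:.1%}")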

It’s useful to write the story of a user interaction with your site to see the points at which an unfavourable choice can be made:

Jeff arrived on the shoes page of our website after searching for “blue shoes”.  He had a lot of different styles to choose from, but he couldn’t see  any blue ones, so he left.

Melissa came to our website to find “red court shoes” and spent some time browsing the court shoes range. Because we had more than 50 pairs to choose from, she took her time, but she only looked at the first page of the results and didn’t find what she was looking for, because there weren’t any red ones.

In both of these examples, the user came into the site using a search term that didn’t really apply to the page they were looking for.  This might be because the categorisation of products in the site does not match user behaviour.  To convert your customers, you need to understand how they buy things and create a site architecture that matches this paradigm.

Simplify the user journey to the point where buying is easy and all the steps are clear.

Design Appropriately

For some reason, we seem to have arrived at a place where beautiful design is generally being ignored in favour of functional brutalism. Grids of products and thumbnail sized images are what too many online retailers think is enough, but why? Think about how shops merchandise their products. For fashion they create outfits; for electrical goods, they show you a set up of how the product could look in your living room.

When you buy a new car off the lot, you can see it from all angles, polished to perfection, and placed in a location that shows it off, not dirty in a parking lot.

Discretionary purchases are about aspiration.  Make customers want to buy from you by making the products look better on your site than a competitor.

Having a consistent design language that customers can identify with is an art form.  It needs to be appropriate to your business goals and your brand identity.  Design is powerful.  Think about Apple, Google, Prada.  Apple in particular.  Design is an investment, and it is not an optional extra.  It is how you communicate with your customers.  If all you want to be is a grid of other people’s products, then you create an environment that promotes other products and doesn’t differentiate.

Incentivise Up Sell at the Checkout

Most retail sites now think about the ability to add impulse buys at the checkout.  Supermarkets do this well, but these products are not always appropriate to the sale.

A busy checkout

Use data intelligently. Look at the products that people commonly buy with the exact product your customers have bought, not similar products, exact products. Create Point of Sale displays that allow you to complement one product with another and show how they work together. Someone’s buying a laptop? Show them the right docking station, or speakers from the same design set. Someone’s buying a flatscreen TV? Show them a wall bracket.

The incentive?  Offer them the second product at a discount, or give them free shipping if they push to the next tier:

  • Get 10% off this Asics Running Vest when you buy it with your Asics Trainers
  • Free Shipping on Sony Flatscreen TVs when bought with a Sony Blu Ray Player
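Coming back to the “bought together” idea, here’s a minimal sketch of finding the most common co-purchases from past order data; the orders and product names are invented:

    # Find the products most often bought alongside a given product,
    # using past order data. Orders are invented for illustration.
    from collections import Counter
    from itertools import combinations

    orders = [
        {"laptop", "docking station"},
        {"laptop", "docking station", "mouse"},
        {"laptop", "laptop bag"},
        {"flatscreen tv", "wall bracket"},
        {"flatscreen tv", "wall bracket", "hdmi cable"},
        {"flatscreen tv", "blu ray player"},
    ]

    co_purchases = Counter()
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            co_purchases[(a, b)] += 1

    def suggestions(product, top_n=2):
        related = Counter()
        for (a, b), count in co_purchases.items():
            if a == product:
                related[b] += count
            elif b == product:
                related[a] += count
        return related.most_common(top_n)

    print(suggestions("laptop"))
    print(suggestions("flatscreen tv"))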

Keep the upsell simple by only offering one or two products, and make sure that they can be added straight to the cart so you don’t interrupt the user journey.  Upselling with targeted impulse buys should not make it harder to buy the first product.

Test and Refine

The key aspect of a conversion friendly website is flexibility: your customers will evolve, and so must you. Build flexibility into your site so that you can change things and test different layouts. If your design is static and you have made a mistake, you are stuck with it. If you can’t change imagery, categorisation, page layout or conversion paths, you can’t improve them.

Use tools like ClickTale and Google Website Optimiser to understand and improve user experience, and remember that the key element of all this is the user: a happy buying experience is an experience, shopping is an event, buying is a conscious choice.

Conversion Friendly Web Design

There was an interesting story on The Independent today that cited a study about how badly designed websites have cost businesses around £500 million in sales over the past 3 years.  At first glance, that’s a statistic that makes you think that the phones would be ringing off the hook at design agencies around the country as marketing managers frantically look at CRO solutions that might help them get back some of the business that they’re losing.

It’s a great headline – £500 million is a lot of cash, but it’s over 3 years, which means that it’s around £167 million per year. According to IMRG, online retail represented £31.5 billion in the first six months of this year and is growing at around 15% per year, so the £500 million suddenly seems like a lot less: if cumulative sales over the last 3 years have been around £155 billion, the £500 million is just 0.3% of the total.
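For what it’s worth, the rough working behind those numbers (using the figures quoted above) is:

    # Rough working for the figures quoted above.
    lost_sales = 500e6        # £500m reportedly lost over three years
    cumulative_3yr = 155e9    # rough three-year total used above

    print(f"Lost sales per year: £{lost_sales / 3 / 1e6:.0f}m")
    print(f"Share of three-year online retail sales: {lost_sales / cumulative_3yr:.2%}")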

Also, it’s important to note that according to online retail expert Jonathan Marsh:

Companies need to realise that it is not like the high street, people who are not sure do not have to walk 100 yards down the road to go to someone else’s store; they just have to click ‘back’.

It’s not necessarily the case that those sales have been lost – they’ve just been mopped up by people further down the rankings. If Jeff doesn’t find it easy to buy what he wants on the shop that ranks number 1 for a product, he simply goes back to the SERP and clicks on the next result. And let’s be honest, even if the second (or third) result is still lacking in conversion friendly web design, at some point he’s probably going to battle through and buy anyway. Sales happen; customers just need to be able to buy.

There are ways of estimating conversion rates for different sites even when they’re not publicly available, and understanding where your site sits on a league table is an important starting point for how conversion friendly your design is.

The double digit growth in online retail in the UK over the past few years has probably masked a lot of the effects of conversion rate.  I can imagine marketing teams talking up a 10% increase in sales as a success, but failing to take into account the wider market.

If you’re spending £100,000 per month to get 10,000 sales and your top competitor is only spending £50,000 to do the same volume, then in effect they have a huge tactical advantage: they can discount their products to improve sales further, or reinvest the difference into other channels and grow their market share more efficiently than you.
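In cost-per-acquisition terms, using the same figures:

    # Cost per acquisition for the two advertisers described above.
    your_spend, your_sales = 100_000, 10_000
    rival_spend, rival_sales = 50_000, 10_000

    print(f"Your cost per sale: £{your_spend / your_sales:.2f}")
    print(f"Competitor's cost per sale: £{rival_spend / rival_sales:.2f}")
    print(f"Monthly budget freed up by matching them: £{your_spend - rival_spend:,}")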

The real surprise is how little the work to make a website more conversion friendly costs as a share of the overall marketing budget, and it often raises an eyebrow when a company looks to spend more money rather than spending money more wisely. The fact is that a poorly converting website is leaving money on the table. A site that converts at half the rate of a competitor is effectively spending twice as much on each sale.