2013 was a tumultuous year for SEOs. Just when SEOs were finally getting used to the rhythm of changes with Panda & Penguin, Google changed the game with several updates that altered how SEO is practiced going forward.

To get a better idea about the impact of these changes, I reached out to three top SEO veterans in the industry to get their “from the trenches” point of view and to find out how they are adjusting to them. Below is my distinguished panel of guests:

 

Jody Nimetz – Group Manager of Organic Search at Mediative

Twitter | LinkedIn

Jody has over 10 years of experience with enterprise SEO, working with clients such as Grainger, ToysRUs, SurveyMonkey, Skull Candy, and American Red Cross.

 

Aaron Bradley – Digital Marketing Consultant

Twitter | LinkedIn

Aaron specializes in organic search engine optimization for enterprise ecommerce and news media sites. He stays up to date on the semantic web and structured data as logical extensions of current search technologies, with practical expertise in the use of schema.org and structured data markup.

 

Barry Adams – Digital Services Director at Pierce Communications

Twitter | LinkedIn

Barry is the Director of Digital Services at Pierce Communications in Belfast. He is also an editor and blogger at the award-winning blog StateofDigital.com, and a conference speaker and lecturer on SEO, social media, and digital marketing strategy.

 

#1 – Google Hummingbird was a significant change for Google. According to Search Engine Journal, Hummingbird affected 90% of search queries and likely changed data retrieval, artificial intelligence, and how data is accessed and presented to users.

How has Hummingbird impacted your organic search traffic since its launch this past summer?

Jody – To be honest, Hummingbird has not really impacted our clients’ organic search traffic. We have always worked with clients to make sure that their sites produce high-quality content so that they are relevant not only to the search engines but also to the end user.

We know that Google is trying to produce a smarter engine, so we work with clients to enhance their content so that it is relevant for web and mobile search. Google’s move toward more conversational search and serving up “smarter” results just lets us know that we are focusing on the right areas with our clients.

 

Aaron – Of course one of the amusing things about Hummingbird is that it’s the major Google algorithm update that nobody noticed. Certainly in terms of the rawest metric – organic visits from Google – even a retroactive review fails to uncover any anomalies in the sites I have numbers for, regardless of the vertical: news, content, ecommerce and information sites I’ve reviewed all fail to show clear evidence of the rollout.

Analysis of organic keyword-level data might provide more clues about the impact of Hummingbird if those data were reliable, but of course any such analysis is of dubious value because of the continued rise in the number of encrypted queries during this period:  in August “not provided” keywords were already running between 50% and 70% for sites in my portfolio.

This is not an unimportant conundrum for those trying to gain insights into Hummingbird, because it beggars belief to think that such a seemingly substantial change has had so little impact on observed traffic.  Indeed, I think the failure of SEO pundits (myself included) to satisfactorily account for this disconnect puts almost all subsequent “analysis” of Hummingbird’s impact into the “highly speculative” category.

Many have pointed out (Ammon Johns and Bill Slawski come to mind here) – I think correctly – that Hummingbird’s impact has been on Google’s ability to better understand queries (even to the extent of substantially rewriting them), making this an algorithm change without the traditional impact on the ranking of results, such as one could readily observe with Panda or Penguin.

 

Barry – It’s not had a big impact on our client sites so far. I’m not sure if this is due to a delayed roll-out in the UK, or if it just didn’t have a huge impact on the SERPs we’re targeting, but we haven’t seen any major shifts in either direction. Most of the SERPs we’re active in have not seen any sudden changes, such as knowledge graph boxes, nor have there been dramatic ranking fluctuations.

I do expect to see a growing impact in the near future as the aspects of search that Hummingbird has enabled for Google become more prominent, and we’ve already started rolling out changes in our tactics to prepare for where we believe Google search is heading, with a special focus on more abundant use of structured data, social signals, and information-based content.

 

#2 – One of the theories around Hummingbird is that Google will now be intelligent enough to figure out the synonyms and themes of a web page, so that it can surface relevant pages in search results even if your exact keywords aren’t contained in the page.

How has this changed how you perform keyword research and web page copywriting?

Jody – This is a fundamental shift in the industry. Performing keyword research is no longer about mapping a single keyword to a given page. When conducting keyword research we need to ensure that our keyword baskets are well-rounded and reflective of what the client’s site is actually about. The one thing about keyword research is that it is iterative and needs to be revisited on a regular basis.

As for copywriting, it’s really about enhancing the content and not just dropping in exact-match keywords. If you have a page about car insurance, for example, you are not simply going to sprinkle the copy with that term; you might use additional semantically relevant terms such as “auto insurance” to build a theme around the page’s topic. Google is looking to provide answers to searchers quickly, so you need to ensure that your content has the answers to those users’ questions.

 

Aaron – While Hummingbird is probably an update that more formally integrates this sort of deeper understanding of web resources – those “semantic search” capabilities – into the core of the algorithm, Google has long been moving in this direction, and has long been able to serve up results and recommendations based on the entities and concepts described by keywords.

That is to say, Hummingbird has put semantic search in the spotlight, but it doesn’t represent its debut. I think this is important to point out for two reasons.

First, it places Hummingbird properly in the continuum of Google’s march toward semantic search (recall that both the Knowledge Graph and schema.org considerably predate Hummingbird) rather than misidentifying it as a sudden departure from previously-employed search technologies.

Second, it helps to highlight the fact that the search engines – not just Google – have been increasingly employing semantic web technologies to improve their understanding of queries and produce faster, more accurate, more relevant results.  Bing, too, is perfectly capable of returning results based on the entities underlying keyword queries (I think they might even be doing a better job of this than Google for certain categories of image search):  this is a general change in the direction search engines are taking.

In this context, then, Hummingbird doesn’t change my approach to keyword research, but rather helps inform and refine the changes I’ve gradually made to such research in the past couple of years.

That shift has been one where entities (both named entities and concepts) are now the primary mechanism by which I organize material, with keywords playing a supporting role (albeit a vital one). So research questions start to deviate from the traditional monolith “what keywords are important here?” Instead my research now focuses on the things those keywords reference. What are the properties of this thing? With what other things does it share these properties? What words and phrases are used by searchers to refer to these things and their properties?

This focus carries over to my approach to search-friendly copywriting as well, of course. On one hand it means ensuring that concepts and named entities – rather than keywords and keyword variations – are covered by relevant resources. But it has also had an impact on how I view and use links, which I now consider to be fully part of the copywriting enterprise, as they help the search engines understand the context of that copywriting and the relationships between entities.

Barry – I think this is probably one of the over-hyped aspects of Hummingbird. For us, using synonyms and writing in a purely ‘human’ way has always been a part of the SEO process, as we felt targeting only one specific keyword on a given page was already a shortsighted and unnecessarily restrictive approach. Hummingbird only confirmed that we, like many other SEO agencies who did this early on, were on the right track.

 

#3 – In September Google officially moved all search queries to secure search and, as a result, no longer provides keyword search query data in Google Analytics. Most SEOs and digital marketers have used that data to support not only SEO initiatives but also optimization of the website user experience.

What methods are you using now to replace organic search query reporting?

How are you providing evidence to clients that your SEO recommendations are indeed working as planned?

Jody – It was really disappointing to see this data go, for the simple fact that your analysis is only as good as the data that you have on hand. Losing keyword referral data from Google was/is a big deal.

It was not entirely unexpected though. It has, however, required some changes in how and what metrics we report on. For example, it becomes more difficult to tell the branded vs. non-branded traffic story. We’ve had to make adjustments to what we report on and to the data that we are analyzing.

We can still report on search queries from other search engines and we can access a sample of query data from Google Webmaster Tools, but we have shifted focus from keyword-specific data to more page-level data. We evaluate specific landing pages and monitor how the traffic is trending for a given page. We know that if we have “optimized” a page we expect to see a positive lift in traffic or engagement with that page. As a result we can determine how successful (or not) those optimization efforts have been. Driving traffic to a website is only one part of the equation; what you do with that traffic is a whole other story.
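As a rough illustration of the page-level approach Jody describes (a minimal sketch, not Mediative’s actual process), you could compare average daily visits to optimized landing pages before and after the changes went live. The CSV filename, column names, date and page paths below are all hypothetical placeholders:

```python
import pandas as pd

# Hypothetical analytics export: one row per landing page per day,
# with columns: date, landing_page, visits
df = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])

OPTIMIZATION_DATE = pd.Timestamp("2013-10-15")          # date the page changes went live
PAGES = ["/car-insurance/", "/auto-insurance-quotes/"]  # pages that were optimized

subset = df[df["landing_page"].isin(PAGES)]
before = subset[subset["date"] < OPTIMIZATION_DATE].groupby("landing_page")["visits"].mean()
after = subset[subset["date"] >= OPTIMIZATION_DATE].groupby("landing_page")["visits"].mean()

# Percentage lift in average daily organic visits per page
report = pd.DataFrame({"avg_daily_before": before, "avg_daily_after": after})
report["lift_pct"] = (report["avg_daily_after"] / report["avg_daily_before"] - 1) * 100
print(report.round(1))
```

Pair a trend like this with engagement metrics for the same pages and you get a keyword-free way to show whether an optimization effort moved the needle.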

 

Aaron – Some previously-reported metrics here simply have a new importance – most notably the number of visits originating from organic search, of course. Also, I’ve never been one to eschew keyword ranking data, and continue to use this to inform the relative success of SEO efforts (with the proviso that, as noted above, I’ve made changes to the way that keyword data is organized).

Above all, effective SEO in the Hummingbird era should result in an improved number of conversions from search, alongside improved engagement metrics like pages per visit, time on site and sharing of material discovered through search. This is an indication that you have helped the search engines better connect the dots between a user query and that user’s intent as it pertains to the product or service you offer.

But I haven’t fully cracked this nut yet (and I don’t think anyone has). I expect (or at least I hope) we’ll see better tools developed to help gauge the success of search optimization efforts, especially in relation to the type of search results in which a site appears, and how a site is represented in search results.

 

Barry – Yes, the loss of exact keyword data is painful, but not insurmountable. We’ve given more prominence in our reporting to keyword ranking reports for the most valuable keywords, making sure we present that data in the right context, and we’ve also begun including the Google Webmaster Tools SEO data in our reports. While that data is far from reliable, over time we feel it gives a strong indication of trends in search visibility.

The real problem lies in accurately reporting traffic growth in search as separate from the effects of above the line advertising done by the client. That’s where we’ll really feel the pain from Google’s withdrawal of the keyword data. I’m not sure we’ll ever be able to crack that nut. In effect it’s become much more important to manage client expectations and clarify the context of our SEO efforts – we no longer have the luxury of letting the data speak for itself, because the data has gone missing.

 

#4 – Google launched the disavow link tool over a year ago, giving SEOs the ability to tell Google to ignore links from unfavorable websites.

What do you think of the concept of this tool? Is it something that most SEOs are underutilizing? Or is it a tool that could have dire consequences in the hands of those who don’t know how to use it?

Jody – I think that Google’s need to release such a tool is a result of their struggles to combat link spam. When they launched their Caffeine update in 2009, their index grew substantially. With that came a ton of lower-quality pages that were being found in top search results. A number of these “spammy” results came from pages/sites with artificially inflated link inventories, so Google needed a way to keep them from appearing in their web results.

This is one of the reasons why the whole concept of link popularity is questionable. It is something that has been easily gamed by spammers and marketers for the past seven or eight years. The disavow tool is a mechanism that helps Google fight the link spam battle. As a tool it probably is being underutilized at this point, but as more sites continue to be devalued as a result of poor link profiles we will see it used more and more, at least until someone comes back with a better mechanism for dealing with link spam.

For those that do not understand how to use the tool, it could have a negative impact on their online efforts.  As Google states,

“This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.”

I think overall that this tool can be very useful when used correctly, but I’d have to agree with Google in that most sites probably do not need to use this tool on a regular basis.
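For readers who haven’t used it, the disavow file itself is just a plain-text list uploaded through Google Webmaster Tools: lines beginning with # are comments, a domain: prefix disavows every link from that domain, and any other line is treated as a single URL. A minimal sketch, with hypothetical domains and URLs:

```
# Spammy directory links - removal requests sent 2013-11-04, no response
domain:example-spam-directory.com

# Individual paid links we could not get taken down
http://example-article-network.net/some-paid-post.html
http://another-example-site.org/widget-links/page2.html
```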

 

Aaron – The concept behind this tool is pretty straightforward: relying too heavily on any single factor in the ranking of web documents eventually undermines that factor’s effectiveness, because there’s a demonstrable return on an investment to game that factor.

The disavow tool – by which webmasters are called upon to prove their innocence – signals the beginning of the end of the link economy (as evidenced by Yandex’s recent announcement that, soon,  they will no longer be using links to rank web documents for commercial queries).

Whether or not SEOs are “underutilizing” this tool depends, of course, on the nature of their sites’ inbound linking environment: if they have a lot of “disreputable” links they should be using the tool liberally, but otherwise ignoring it. Of course, in order to know whether or not to use this tool you have to either be notified that your link environment is problematic (thanks Google) or, more commonly, spend a bunch of money on tools and human labor researching your link environment to determine whether or not it is problematic (thanks Google).

In short, the disavow tool is a necessity because of the corner in which the search engines have painted themselves and – like the increasingly Kafkaesque directives from the search engines on the employment of nofollow – a complete farce.

Barry – I’m not a fan of the disavow tool at all. I see it as a way for Google to crowdsource spam link detection to the SEO community. I would go as far as to say that every SEO that uses the disavow tool is actively harming their own industry and basically being duped into doing Google’s job for them.

There are plenty of cases of penalised websites recovering without any use of the disavow tool, and I would urge SEOs to use tactics that avoid disavowing any links. The damage you’ll do with the disavow tool to SEO in general will come back and bite us all in the arse somewhere down the line.
Unfortunately too many SEOs embrace short-term thinking to the detriment of long-term strategies, so I fear Google has very effectively leveraged many SEOs’ inherent uncertainty and lack of confidence to the search engine’s own advantage.

 

#5 – Finally, with 2013 drawing to a close, how have your own SEO strategies evolved over the past year? And what are the biggest takeaways and insights you’ve gained so far as an SEO?

Jody – One of the most important takeaways, or perhaps reminders, from this year is realizing that from an organic search perspective we truly are at the mercy of Google’s algorithms. You can do everything to the best of your abilities, but Google can make a fundamental change that impacts your organic search efforts.

I’ve never been a big fan of the term SEO because what we do has always been more than “search engine optimization”.  It has always been about serving up rich content that will solve a user’s query or question.  It is about getting the right messaging out at the right time to the right audience.  As cliché as that may sound it has been a successful formula that continues to work in organic search.

I think 2013 caused a lot of “SEOs” to really take a long hard look at the tactics and strategies that they were deploying and forced them to revisit these efforts. From link building to keyword research to copywriting, search is in fact becoming more conversational, and as a result we need to be flexible enough to adapt to the fact that Google is going to continue to evolve. The way people interact with Google is changing. In a few years, will we even need to type anything into a Google search bar, or are we just going to speak to Google and receive the results we are looking for? 2013 was a very active year for Google; look for more of this in 2014.

 

Aaron – Evolution is the right word: that is, my search strategies changed gradually over the course of the year, just as they have in prior years. Some themes seem to have kept rearing their heads, though: increased use of structured data markup; more closely marrying search to social (both at the conceptual and structural level); using links more liberally and, at the same time, more precisely, with the aim of helping search engines and other data consumers understand content (rather than simply using links for “improved ranking”).

To this last point, my biggest takeaway was actually something of an epiphany about the nature of semantic search (an observation about the search engines that most readers will probably find too unremarkable for epiphany fodder): namely, that entities in the semantic web are uniquely identified by a (usually HTTP) URI.

While a well-known staple of the semantic web, this was a big takeaway for me because I realized it liberates search from keywords, and facilitates the production of everything from rich snippets to instant answers in the Knowledge Graph. While (despite my deliberately polemical statements to the contrary) keywords will forever play not only an important but an intrinsic role in search, it is entities that now rule the roost – and they are only able to do so because of this magnificently simple method of identifying the things and concepts referenced on the web.
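To make that URI point concrete, here is a minimal sketch of schema.org markup expressed as JSON-LD; the organization name, URLs and identifiers are placeholders, not taken from any panelist’s site, and the exact properties you would use depend on your own content:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Widgets Inc.",
  "url": "http://www.example.com/",
  "sameAs": [
    "http://en.wikipedia.org/wiki/Example_Widgets",
    "http://www.freebase.com/m/0example"
  ]
}
</script>
```

The sameAs URIs tell a search engine which real-world entity the page is about, independent of the particular keywords used to describe it.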

 

Barry – Like most SEO and digital marketing agencies, our strategies have evolved to be much more content focused. We now insist on implementing well-thought-out content strategies for nearly all of our SEO clients. Additionally, our link building has undergone a profound transition, and we now only engage in manual and fully personalised outreach. While this results in fewer links than traditional volume link building tactics, the links we do achieve are much more valuable, and we feel confident they’ll avoid being downgraded by any future Google penalty or algorithmic tweak.

The biggest insight I’ve gained as an SEO to date is that in the end it’s all about people. SEO, as part of digital marketing, is a tactic that connects consumers to brands, and people in different organisations with one another. If you approach SEO as a people-centric endeavour, I think you’ll set yourself up for long term success.
_____
Wow, thank you to Jody, Aaron & Barry for their candor. It’s great to hear the different perspectives from industry veterans and how they practice SEO. Well, that brings 2013 to a close for now. Hope to do another wrap-up post in 2014!

Hey there, I'm Marc!

I'm the Director of Growth Channels @Hootsuite. I've worked in digital marketing since 2000 for agencies, startups and companies like Electronic Arts & SAP. Thanks for checking out my site where I share digital marketing strategies on how to increase website traffic and generate more revenue.

Comments

  1. Nice read Marc, thanks! I’ve got one question: how does today/yesterday’s update to Google Webmaster Tools, providing much more accuracy in Clicks and Impressions, influence the answers of Jody, Aaron and Barry?

    • Marc Bitanga says:

      Hi Steven,

      Yes the new GWT announcement of keyword data availability is welcome news! From some tweets I’ve seen so far, SEOs are indicating that the data may be around 70%-80% accurate. But definitely good news!
