What's the real impact of machine learning on SEO? This has been one of the biggest debates within SEO over the last year.
I won't lie: I've become a bit obsessed with machine learning. My theory is that RankBrain and/or other machine learning elements within Google's core algorithm are increasingly rewarding pages with high user engagement.
Basically, Google wants to find unicorns – pages that have extraordinary user engagement metrics like organic search click-through rate (CTR), dwell time, bounce rate, and conversion rate – and reward that content with higher organic search rankings.
Happier, more engaged users means better search results, right?
So, essentially, machine learning is Google's Unicorn Detector.
Machine Learning & Click-Through Rate
Many SEO experts and influencers have said that it's totally impossible to find any evidence of Google RankBrain in the wild.
That's ridiculous. You just need to be smarter about how you design and run your SEO experiments.
That's why I previously ran an experiment looking at CTR over time, hoping to find evidence of machine learning at work.
What I found: results with higher organic search CTRs get pushed higher up the SERPs, which earns them even more clicks.
Click-through rate is just one way to see the impact of machine learning algorithms. Today, let's look at another important engagement metric: long clicks, or visits where the searcher stays on your site for a long time after leaving the SERP.
Time on Site Acts as a Proxy for Long Clicks
Are you not convinced that long clicks impact organic search rankings (whether directly or indirectly)? Well, I've come up with a super easy way that you can prove to yourself that the long click matters – while also revealing the impact of machine learning algorithms.
In today's experiment, we're going to measure time on page. To be clear: time on page isn't the same as dwell time or a long click (that is, how long people stay on your website before they hit the back button to return to the search results they came from).
We can't measure long clicks or dwell time in Google Analytics. Only Google has access to this data.
Time on page isn't what ultimately matters to us. We're only looking at it because it's very likely proportional to the metrics we can't see.
Time on Site & Organic Traffic (Before RankBrain)
To get started, go into your analytics account. Pick a time frame before the new algorithms were in play (e.g., 2015).
Segment your content report to view only your organic traffic, and sort by pageviews. Then run a comparison analysis of pageviews against average time on page.
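If you'd rather script this than click through the interface, here's a minimal sketch in Python (pandas) that runs the same comparison against a CSV export of the report. The file name and column names (page, pageviews, avg_time_on_page) are assumptions – rename them to match whatever your export actually contains.

```python
import pandas as pd

# Hypothetical CSV export of the Analytics content report with the
# organic segment applied; adjust the file and column names to match.
df = pd.read_csv("organic_landing_pages_2015.csv")

# Site-wide average time on page, weighted by pageviews (a plain
# per-page mean would overweight low-traffic pages).
site_avg = (df["avg_time_on_page"] * df["pageviews"]).sum() / df["pageviews"].sum()

# Mirror the report view: sort by pageviews and keep the top pages.
top = df.sort_values("pageviews", ascending=False).head(32).copy()

# Compare each top page's average time on page to the site average.
top["vs_site_avg"] = top["avg_time_on_page"] / site_avg

# Pages below the site average are the donkeys we're hunting for.
donkeys = top[top["vs_site_avg"] < 1.0]
print(f"Site average time on page: {site_avg:.0f} seconds")
print(donkeys[["page", "pageviews", "avg_time_on_page", "vs_site_avg"]])
```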
When I ran this report for our site, it surfaced the 32 pages that drove the most organic traffic in 2015. Time on site was above average for about two-thirds of those pages, but below average for the remaining third.
That below-average third? Those are the donkeys – pages that were ranking well in organic search but, in all honesty, probably had no business ranking well, at least for the search queries that were driving the most traffic. They were out of their league: time on page was half or a third of the site average.
Time on Site & Organic Traffic (After RankBrain)
Now let's do the same analysis. But we're going to use a more recent time period when we know Google's machine learning algorithms were in use (e.g., the last three or four months).
Run the same comparison analysis for this recent window. Look at what happens now when we analyze the organic traffic: all but two of our top pages have above-average time on page.
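If you scripted the first report, the before/after comparison collapses into a single number per period: the share of top pages sitting below the site average. Here's a hypothetical extension of the earlier sketch – the file names are placeholders for one export per time period.

```python
import pandas as pd

def donkey_share(csv_path, n_top=30):
    """Share of the top-trafficked pages with below-average time on page."""
    df = pd.read_csv(csv_path)
    # Pageview-weighted site average, as in the earlier sketch.
    site_avg = (df["avg_time_on_page"] * df["pageviews"]).sum() / df["pageviews"].sum()
    top = df.sort_values("pageviews", ascending=False).head(n_top)
    return (top["avg_time_on_page"] < site_avg).mean()

# One hypothetical export per period.
for period, path in [("2015", "organic_landing_pages_2015.csv"),
                     ("recent", "organic_landing_pages_recent.csv")]:
    print(f"{period}: {donkey_share(path):.0%} of top pages below the site average")
```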
This is kind of amazing to see. So what's happening?
Does Longer Dwell Time = Higher Search Rankings?
It seems that Google's machine learning algorithms have seen through all those pages that used to rank well in 2015, but really didn't deserve to be ranking well. And, to me, it certainly looks like Google is rewarding higher dwell time with more prominent search positions.
Google detected most of the donkeys (about 80 percent of them!) and now nearly all the pages with the most organic traffic are time-on-site unicorns.
I won't tell you which pages on the WordStream site those donkeys are, but I will tell you that some of those pages were created simply to bring in traffic (mission: successful), and the alignment with search intent wasn't great. Most likely, someone else created a page that matched the intent better, and Google swapped it in.
In fact, here’s one example: our AdWords Grader used to rank on page 1 for the query “google adwords” (which has huge search volume – over 300,000 searches a month!). The intent match there is low – most people who search for that phrase are just using it as a navigational keyword to get to the AdWords site. A small percentage of those searchers might simply want to know more about Google AdWords and what it is, using Google as a way to find a Wikipedia-style overview. Nothing in the query suggests they’re looking for a tool to diagnose AdWords problems. And guess what? In 2015, the Grader page was one of those top pages, but it had below-average time on site.
So at some point, Google tested a different result in place of the Grader – our infographic about how Google AdWords works. It’s still ranking on page 1 for that search query, and it matches the informational intent of the keyword much better.
A Few Caveats on the Data
To be clear, I know these analytics reports don’t directly show a loss in rankings. There are other potential explanations for the differences between the reports – maybe we created lots of new, super-awesome content that ranks for higher-volume keywords and simply displaced the time-on-site “donkeys” from 2015. Further, certain types of pages might have low time on site for a perfectly acceptable reason (for example, they might deliver the information the user wants very quickly).
But internally, we know for sure that a few of our pages that had below average time on site have fallen in the rankings (again, at least for certain keywords) in the past couple of years.
And regardless, it’s very compelling to see that the pages that are driving the most organic traffic overall (the WordStream site has thousands of pages) have way above average time on site. It strongly suggests that pages with excellent, unicorn-level engagement metrics are going to be the most valuable to your business overall.
This report also revealed something ridiculously important for us: the pages with below-average time on site are our most vulnerable pages in terms of SEO. In other words, the two remaining top pages in the second report above that still have below-average time on site are the ones most likely to lose organic rankings and traffic in the new machine learning world.
What's so great about this report is that you don't have to do a lot of research or hire an SEO to do an extensive audit. Just open up your analytics and look at the data yourself, comparing a period from well before these algorithms (e.g., 2015) with recent history (the last three or four months).
What Does It All Mean?
This report is basically your donkey detector. It will show you the content that is most vulnerable to future incremental losses in traffic and search rankings from Google.
That's how machine learning works: it doesn't eliminate all your traffic overnight (like a Panda or Penguin update). The losses are gradual.
What should you do if you have a lot of donkey content?
Prioritize the pages that are most at risk – those that are below average or only slightly above average. If there's no good reason for those pages to have below-average time on site, put them at the top of your list for rewriting or fixing up so they align better with user intent.
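Sticking with the same hypothetical export, a rough triage could look like this: flag every page below (or only slightly above) the weighted site average and sort by organic pageviews, so the highest-traffic laggards get rewritten first. The 1.1× “near average” cutoff is an arbitrary choice of mine, not anything Google has published.

```python
import pandas as pd

df = pd.read_csv("organic_landing_pages_recent.csv")  # hypothetical export

# Pageview-weighted site average, as in the earlier sketches.
site_avg = (df["avg_time_on_page"] * df["pageviews"]).sum() / df["pageviews"].sum()

# Flag pages below or only slightly above the site average; the 1.1
# multiplier is an arbitrary "near average" threshold.
at_risk = df[df["avg_time_on_page"] < 1.1 * site_avg]

# Highest-traffic laggards first: these rewrites move the most traffic.
rewrite_queue = at_risk.sort_values("pageviews", ascending=False)
print(rewrite_queue[["page", "pageviews", "avg_time_on_page"]].head(20))
```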
Now go look at your own data and see whether you agree that time on site plays a role in your organic search rankings. Don't just take my word for it – run your own reports and let me know what you find.