Local Listings: How Canadian Retailers Stack Up and What You Can Learn

Posted by rMaynes1

Local online listings are an essential component of an effective strategy to drive customers into local stores. Ranking highly in search results, and dominating the top positions with your listings, is critical if customers are to find you in an online search.

Mediative, one of North America's leading digital marketing and advertising agencies, partnered with Placeable to analyze twenty-five of Canada's top retail brands (those with multiple local stores, spread nationally) and compare how they're faring in local digital marketing against their American counterparts. The analysis found that 80% of the online listings for these twenty-five retailers are inconsistent, inaccurate, or missing information, and that the top twenty-five US retailers are outperforming the top twenty-five Canadian retailers by over 28%.

The retailers’ digital presence was analyzed across four dimensions, and brands received a score from 0 to 100 for each of the four dimensions. The dimension scores were weighted and combined into a single overall score: the NatLo™ Score (“National-to-Local”). This overall score is an authoritative measure of a company’s local digital marketing performance:


Depth

Depth and accuracy of published location content, as well as the richness and completeness of site information. Some examples include name, address, phone number, descriptions, services, photos, calls-to-action, and more.

Brands that achieve exceptional depth deliver a better customer experience with richer content about their locations and offerings. Greater Depth also produces higher online to offline conversion rates and supports other marketing calls-to-action.


Visibility

Website effectiveness in search/discoverability. Some examples include site structure, page optimization, and web and mobile site performance.

Strong visibility produces higher search engine rankings and greater traffic. It also enables brands to achieve multiple listings in search results. Brands with poor Visibility surrender more traffic to directories and competitors.


Reach

Data consistency and coverage across third-party sites. Some examples include presence, completeness, and accuracy of location data on third-party sites such as Google, Facebook, Factual, Foursquare, and the Yellow Pages directory site (YellowPages.ca).

Brands with outstanding reach can be found by consumers across a range of search engines, social sites and apps. Poor Reach can lead to consumer confusion and misallocated marketing investments.


Precision

Geographic accuracy of location data. For example, the pin placement of each location based on latitude and longitude, or the dispersion of pins on third-party sites (pin spread).

Superior precision enables customers to efficiently navigate to a brand’s location. Failure to ensure Precision damages customer trust and increases the risk of competitive poaching.

Key findings of the report

  • Across twenty-five of Canada’s top retailers, on average, 80% of all third-party site listings are inconsistent, inaccurate, or missing information.
  • The top twenty-five US retailers outperformed their Canadian counterparts by over 28%.
  • In the analysis of twenty-five of Canada’s top retail brands, seven (or 28%) did not have local landing pages on their website, a key component of local SEO strategy that’s needed to rank higher in search engines and get more traffic.
  • Of those that included local landing pages, 72% failed to provide any content over and above the basics of name, address, hours, and phone number on their location pages. Engaging local content on local pages increases a brand’s visibility on the search engine results page, and ultimately drives more traffic to the website and in-store.
  • When it comes to Facebook location data, 90% was either inconsistent or missing, and 75% of Google+ location data was either inconsistent or missing. Inadequate syndication of location data across the third party ecosystem leads to poor placement in search engine results and the loss of online site visits.
  • Only 8% of map pin placements were “good,” with 92% being “fair” or “poor.” Inaccurate pin placements lead to customer frustrations, and a stronger chance customers will visit competitors’ locations.

Canada’s best-performing retail brands

Brands with NatLo™ scores of over 70 have positioned themselves digitally to perform effectively within their local markets, drive consumer awareness, achieve online and mobile visibility, and capture the most web traffic and store visits. Brands achieving a score in the 60s are still doing well and have a good grasp of their local strategy; however, there are still areas that can be improved to maintain competitiveness. Scores below 60 indicate that there are areas in their local strategy that need improvement.

The average score across the Canadian retailers that were analyzed was just under 48—not a single one of the brands was excelling at maintaining accurate and consistent information across multiple channels. Ultimately, the listings accuracy of Canadian brands is weak.

The five top-performing brands analyzed, whose corresponding NatLo™ scores are detailed in the full report, included:

  • Jean Coutu
  • Lululemon Athletica
  • Real Canadian Superstore

Deep dive on depth: Jean Coutu

Jean Coutu performed exceptionally well in the dimension of depth, achieving the highest score across the retailers (80). What does Jean Coutu do to achieve a good depth score?

  1. The brand publishes the name, address, phone number, and hours for each location on its website.
  2. Additional location information is provided, including local flyers and offers, directions, services provided, brands offered, photos, and more (see image below). This delivers a much better customer experience.

Deep dive on visibility: Rona

Rona performed exceptionally well in terms of visibility, with a score of 86.

Rona achieves superior digital presence by using optimized mobile and web locators, plus page optimization tactics such as breadcrumb navigation. The website has a good store locator (as can be seen in the image below), with clean, location-specific URLs that are easy for Google to find (e.g., http://www.rona.ca/en/Rona-home-garden-kelowna-kelowna):

In this study of Canadian retailers, many brands struggled with visibility due to the absence of specific, indexable local landing pages and a lack of engaging local content.

Deep dive on reach: Real Canadian Superstore

Real Canadian Superstore performed exceptionally well in the dimension of reach with a score of 72. The brand has claimed all of its Facebook pages, and 69% of location data matches what is listed on their website. Real Canadian Superstore also performed well in terms of its Google+ Local pages, with 49% of location data matching. The brand has a good local presence on Facebook, YellowPages.ca, Factual, Foursquare, and Google+ Local. By claiming these pages, the brand is extending its online reach.

Across all third-party sites measured, Real Canadian Superstore had a 0% location data missing rate on average, compared to the total average across all brands of 20%. Many of the brands struggled with external reach, missing important location information on Facebook and Google+, or having inaccurate information when compared to the same location information on the brand’s website.

Deep dive on precision: Rexall Pharmacy

Interestingly, none of the top-scoring retailers performed very well in terms of precision (accuracy and consistency of pin placements). Rexall Pharmacy was the top performer, with a score of 62. At the time of writing, their precision scores were as follows:

  • 39% were good.
  • 35% were fair.
  • 26% were poor.

In the retail industry, competing businesses can be located very close to one another, so if the location of your business on a map, and the directions provided through the map, are accurate, there's less chance of your customers visiting a competitor's location instead of yours.

In conclusion

All in all, Canada’s top retail brands excel at brand advertising at the national level. But when it comes to driving local customers into local stores, a different strategy is required, and not all the top retailers are getting it. A solid local strategy requires all four dimensions to be addressed in order to achieve the synergies of an integrated strategy. By emulating the tactics implemented by the four retailers highlighted above, even the smallest local business can make significant improvements to their local online strategy.

Rebecca Maynes, Manager of Content Marketing and Research with Mediative, was the major contributor on this report. The full report, including which brands in Canada are performing well and which need to make some improvements, is available for free download.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Source: moz

​Announcing MozCon Local 2016!

Posted by EricaMcGillivray

Looking to level up your local marketing and SEO skills? Join us in Seattle for MozCon Local, Thursday and Friday, February 18-19. With both an all-day conference and a half-day workshop, you’ll hear from top local speakers on topics critical to local marketing, including local link building, app search, mobile optimization, content creation, and so much more. For those who attended LocalUp Advanced last year, this is its newest iteration.

For Friday’s main show, we’ll have a full day of speakers giving in-depth presentations with tons of actionable tips. Whether you’re an in-house or agency marketer, a Yellow Pages publisher, or a consultant, you’ll come away with a long to-do list and new knowledge. Plus, you’ll be able to interact directly with speakers both during Q&A sessions and around the conference, and spend time getting to know your fellow local marketers.


We've teamed up with our friends at Local U again to bring you in-depth workshops on Thursday afternoon. If you have specific questions and needs for your clients or local marketing, they'll be able to dive into the details, give advice, and address issues unique to your business. For those of you who attended last year's LocalUp, you'll remember the great Q&A sessions. Local U is planning a couple of different tracks, from agency management to recommended tools, which will be a blast.

Buy your MozCon Local 2016 ticket!

Some of our great speakers (more coming!)

Darren Shaw


Darren Shaw is the President and Founder of Whitespark, a company that builds software and provides services to help businesses with local search. He’s widely regarded in the local SEO community as an innovator, one whose years of experience working with massive local data sets have given him uncommon insights into the inner workings of the world of citation-building and local search marketing. Darren has been working on the web for over 16 years and loves everything about local SEO.

David Mihm


David Mihm is one of the world’s leading practitioners of local search engine marketing. He has created and promoted search-friendly websites for clients of all sizes since the early 2000s. David co-founded GetListed.org, which he sold to Moz in November 2012.

Ed Reese

Sixth Man Marketing

Ed Reese leads a talented analytics and usability team at his firm Sixth Man Marketing, is a co-founder of LocalU, and an adjunct professor of digital marketing at Gonzaga University. In his free time, he optimizes his foosball and disc golf technique and spends time with his wife and two boys.

Emily Grossman


Emily Grossman is a Mobile Marketing Specialist at MobileMoxie, and she has been working with mobile apps since the early days of the app stores in 2010. She specializes in app search marketing, with a focus on strategic deep linking, app indexing, app launch strategy, and app store optimization (ASO).

Lindsay Wassell


Lindsay Wassell’s been herding bots and wrangling SERPs since 2001. She has a zeal for helping small businesses grow with improved digital presence. Lindsay is the CEO and founder of Keyphraseology.

Mary Bowling

Ignitor Digital

Mary Bowling’s been in SEO since 2003 and has specialized in Local SEO since 2006. When she’s not writing about, teaching, consulting and doing internet marketing, you’ll find her rafting, biking, and skiing/snowboarding in the mountains and deserts of Colorado and Utah.

Mike Ramsey

Nifty Marketing

Mike Ramsey is the president of Nifty Marketing and a founding faculty member of Local University. He is a lover of search and social with a heavy focus in local marketing and enjoys the chess game of entrepreneurship and business management. Mike loves to travel and loves his home state of Idaho.

Rand Fishkin


Rand Fishkin uses the ludicrous title, Wizard of Moz. He’s founder and former CEO of Moz, co-author of a pair of books on SEO, and co-founder of Inbound.org.

Robi Ganguly


Robi Ganguly is the co-founder and CEO of Apptentive, the easiest way for every company to communicate with their mobile app customers. A native Seattleite, Robi enjoys building relationships, running, reading, and cooking.

MozCon Local takes place at our headquarters in Seattle, which means you’ll be spending the day in the MozPlex. In addition to all the learning, we’ll be providing great swag. Thursday’s workshops will have a snack break and networking time, and for Friday’s show your ticket includes breakfast, lunch, and two snack breaks. Additionally, on Friday evening, we’ll have a networking party so you can meet your fellow attendees, some of whom you may have only met on Twitter before. Face-to-face ftw!

We’re expecting around 200 people to join us, including speakers, Mozzers, and Local U staff. LocalUp Advanced sold out last year, and we expect MozCon Local to sell out, too, so you’ll want to buy your ticket now!

Our best early-bird prices:

| Ticket | Normal price | Early-bird price |
| --- | --- | --- |
| Friday conference, Moz or LocalU subscriber | $599 | $399 |
| Friday conference, general admission | $899 | $699 |
| Thursday workshop, Moz or LocalU subscriber | $399 | $299 |
| Thursday workshop, general admission | $549 | $399 |

In order to attend both the conference and workshop, you must purchase tickets to each. Or you may choose to attend one or the other, depending on your needs.

Buy your MozCon Local 2016 ticket!


Source: moz

One Size Does Not Fit All: Driving Conversions Through Audience Analysis

Posted by SarahGurbach

“We need more content.”
– Every brand ever, at some point in the history of their company

Having worked as a digital consultant over the past few years, I have been exposed to a good number of brands in various industries. Some had content teams that consisted of a single freelance copywriter, while others had a full-blown crew stocked with designers, videographers, and a slew of writers. Regardless of size, though, when discussing their content needs, there was always one common theme: they thought they needed more of it.

And honestly, my reaction would be something like:

“More content?! Easy! I know just the strategy to get you ranking for all the long-tail keywords surrounding your head term. I’ll do a keyword gap analysis, some competitive research, maybe a little trend reporting and come up with 15–20 content ideas for you to send to your copywriter. We’ll optimize those bad boys with title tags, H1s, and some not-so-secretly hidden CTAs, and we’re done. We’ll rank in the SERPs and get the masses to your site. Oh! And we can share this on social, too.”

Seriously, I won’t lie. That’s what I used to do. But then I got sick of blindly going into these things or trying to find some systematic way of coming up with a content strategy that could be used for any brand, of any size, in any industry, that would appeal to any consumer.

So instead of immediately saying yes, I started asking them “why”… roughly 5 times (h/t Wil Reynolds):

1. Why do you want more content?

“Because I want rankings.” (Well, at least they aren’t trying to hide it.)

2. Why do you want rankings?

“Because I want more traffic.” (Okay, we’re getting there.)

3. Why do you want more traffic?

“Because I want more brand awareness.” (Closer…)

4. Why do you want more brand awareness?

“Because I want people to buy my product.” (Ah, here we go.)

5. Why do you want people to buy your product?

“Because I want money.” (Bingo!)

Suddenly, it’s no longer just “we need more content,” but actually “we need the right kind of content for the right kind of audience at the right time in their journey.” And that may seem leaps and bounds more complicated than their original statement, but we aren’t dealing with the same kind of digital atmosphere anymore—and we sure aren’t dealing with the same consumers. Think With Google’s Customer Path to Purchase perfectly visualizes just how complex our consumers have become online.


And it doesn’t just stop there. At each of these interactions, the consumer will be at a different point in their journey, and they are going to need different content to help build their relationship with your brand. Now more than ever, it is imperative that you understand who your audience is and what is important to them…and then be where they are every step of the way.

Super easy, right? Let’s break it down. Here are some ways you can better understand your audience.

Who is your (right) audience?

“If your content is for everybody, then your content really is for nobody.”
Kristina Halvorson, MozCon 2015

While Kristina’s entire presentation was gold, that was probably my favorite line of this past MozCon. Knowing who your audience is (and who your audience isn’t) is pivotal in creating a successful content strategy. When you’re a brand, you have a tendency to fall into the trap of wanting to make everyone your audience. But you aren’t right for everyone, which is why you have a conversion rate of 0.02%. You don’t need to be the best brand for everyone; you just need to be the best brand for someone…and then see if they have friends.

But I’m not saying you have to go out and do more focus groups, consumer surveys, and personas (although it wouldn’t hurt to do a revamp every now and again). Let’s work with what you’ve got.


As stated before, it’s all about targeting the right audience. Let’s say, in this case, the most important people for my business are those that complete a specific goal. Well, I want to find out everything I can about those people and what is bringing them to my site.

To do this, set up a segment in Google Analytics to see only the traffic that resulted in that goal completion:

  • Add Segment
    • Conditions
      • Find your specific goal
      • Change to > or = 1 per session

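The same segment logic can be sketched outside the GA interface if you're working from an export. The session rows and field names below are invented for illustration and don't match any real GA export schema:

```python
from collections import Counter

# Hypothetical session rows, standing in for a GA custom-report export.
# Field names are illustrative only.
sessions = [
    {"age": "25-34", "device": "mobile",  "source": "google",       "goal_completions": 1},
    {"age": "35-44", "device": "desktop", "source": "blog.example", "goal_completions": 0},
    {"age": "25-34", "device": "desktop", "source": "google",       "goal_completions": 2},
    {"age": "45-54", "device": "mobile",  "source": "facebook.com", "goal_completions": 1},
]

def converter_profile(rows, dimension):
    """Mimic the segment above: keep sessions with >= 1 goal completion,
    then tally the chosen demographic dimension."""
    converters = [r for r in rows if r["goal_completions"] >= 1]
    return Counter(r[dimension] for r in converters)

print(converter_profile(sessions, "age"))     # age brackets that convert
print(converter_profile(sessions, "source"))  # referrers that convert
```

Swapping the `dimension` argument lets you profile the same converting segment by device, source, or any other column in your export.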
From there, you can use the demographics functionality in GA to take a deeper dive into that audience in particular:

You can look at age, gender, location, device, and more. You can even look at their interests:

I would also recommend doing this for particular groups of pages to better understand what kind of content brings in users that will convert. You can create groupings based on the type of content (e.g., help articles, branded content, top-of-the-funnel content) or you can just look at specific folders.

You can also use this segment to better analyze which sites are sending referral traffic that results in a goal completion, as this would be a strong indicator that those sites are speaking to an audience that is interested in your brand and/or product.

Twitter followers

While analyzing your current followers may only help you understand the audience you already have, it will absolutely help you find trends among people who are interested in your brand and could help you better target future strategies.

Let’s start with Twitter. I am a huge fan of Followerwonk for so many reasons. I use it for everything from audience analysis, to competitor research, to prospecting. But for the sake of better understanding your audience, throw your Twitter handle in and click “analyze their followers.”

Followerwonk will give you a sample size of 5,000, which still gives you a pretty good overview of your followers. However, if you export all of the data, you can analyze up to 100,000 followers. As a cheap beer enthusiast myself, I analyzed people following Rainier beer and was pleasantly surprised to see that I am in good company (hello, marketers).

You can also use Followerwonk to better understand when your audience is most active on Twitter, so you can prioritize when you’ll post the content you crafted specifically for those people when they’re most active.

Additionally, I am a big fan of Followerwonk’s ability to search Twitter bios for specific keywords. Not only is it useful for finding authorities in a specific space, it allows you to find all of the additional words that your audience is using to describe themselves.

Search Twitter bios for a keyword that is important to your business or a word that describes your target audience. Once you do that, export all the bios, throw those bad boys into a word-cloud tool, and see what you get.

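If you'd rather skip the word-cloud tool, a rough term count over the exported bios surfaces the same signal. Here's a minimal sketch; the bios are invented stand-ins for a Followerwonk export, and the stopword list is deliberately tiny:

```python
import re
from collections import Counter

# Invented bios standing in for a Followerwonk bio export.
bios = [
    "Cheap beer enthusiast. Craft beer on weekends. Marketer.",
    "Wine and whiskey connoisseur who still loves cheap beer.",
    "Craft beer blogger, occasional cheap wine drinker.",
]

# A toy stopword list; a real pass would use a proper one.
STOPWORDS = {"a", "and", "on", "who", "the", "of", "still"}

def top_terms(texts, n=5):
    """Lowercase, tokenize, drop stopwords, and count the rest."""
    words = re.findall(r"[a-z']+", " ".join(texts).lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(n)

print(top_terms(bios))  # the dominant self-descriptions in the bios
```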
Obviously, “cheap beer” leads the way, but look at the other words: craft, wine, whiskey, expensive, connoisseur. Maybe, just maybe, cheap beer enthusiasts also know how to enjoy a fine craft beer every now and then. Would I love to read a cheap beer enthusiast’s guide to inexpensive craft beer? Why, yes, I would. And something tells me that those people on Twitter wouldn’t mind sharing it.

Is this mind blowing? Not necessarily. Does it take 5 minutes, help you better understand your audience, and give you some content ideas? Absolutely.

Facebook fans

Utilize Facebook Insights as much as possible to figure out which audience engages with your posts the most; that’s the audience you want to go after. Facebook defaults to “Your Fans,” but check out the “People Engaged” tab to see your active fans.

Simon Penson talked at SearchLove last year about how you can use Facebook to see if your audience has a greater affinity for a certain product/brand/activity than the rest of their cohort, and I highly recommend you play around with that function on Facebook as well.

What do they need?

Internal site search

I like to look at site search data for two reasons: to find out what users are looking for, and to find out what users are having a hard time finding. I’ll elaborate on the latter and then go into detail about the former. If you notice that a lot of users are using internal site search to find content that you already have, chances are that content is not organized in a way that is easy to find. Consider fixing that, if possible.

I usually like to look at a year’s worth of data in GA, so change the dates to the past year and take a look at what is searched for most often on your site. In the example below, this educational client can easily tell that the most important things to their prospective students are tuition prices and the academic calendar. That may not be a surprise, but who knows what gems you may find in your own internal site search? If I were this client, I would definitely be playing into the financial aspects of their school, as it’s proven to be important.

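Counting a year's worth of internal search terms is a one-liner once you have the export. A sketch with invented queries (GA's Site Search report exports something similar):

```python
from collections import Counter

# Invented internal site-search terms, standing in for a GA Site Search export.
searches = [
    "tuition", "academic calendar", "tuition cost", "tuition",
    "campus map", "financial aid", "tuition", "academic calendar",
]

def top_site_searches(terms, n=3):
    """The most frequent internal searches: what visitors care about most."""
    return Counter(terms).most_common(n)

print(top_site_searches(searches))  # tuition-related terms dominate here
```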

Similar to site search, it’s important to understand what questions your customers have about your product or industry. Being there to answer those questions allows you to be present at the beginning of their path to purchase, while being an authority in the space.

Don’t hit enter

This is an oldie but a serious goodie, and I still use it to this day. Start with the 5 Ws + your head term and see what pops up in Google Autocomplete. This isn’t the end-all be-all, but it’s a good starting point.

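The 5 Ws + head term combinations are easy to generate up front so you can work through them systematically. This sketch just builds the seed strings to type into Google:

```python
# Question words to prepend to your head term (the 5 Ws plus "how").
QUESTION_WORDS = ["who", "what", "when", "where", "why", "how"]

def seed_queries(head_term):
    """Build the strings to type into Google (without hitting enter)."""
    return [f"{word} {head_term}" for word in QUESTION_WORDS]

for query in seed_queries("sushi rice"):
    print(query)  # type each one in and note the autocomplete suggestions
```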
Use a handy tool

I haven’t been able to play around with all of Grepwords’ tools and functionalities, but I love the question portion. It basically helps you pull in all of the questions surrounding one keyword and provides the search volume.


This is a fun one. If you know that there are popular forums where people talk about your industry, products, and/or services, you can use advanced search queries to find anyone asking questions about your product or service. Example:

site:stackoverflow.com inurl:”brand name” AND “product name”

You can get super granular and even look for ones that haven’t been answered:

site:stackoverflow.com inurl:"brand name" AND "product name" -inurl:"answer"

From there, you can scrape the results in the SERPs and sift through the questions to see if there are any trends or issues.

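If you're checking several brands or products, assembling those operator strings programmatically keeps them consistent. A small sketch; the domain and names are placeholders, matching the examples above:

```python
def forum_query(forum_domain, brand, product, unanswered_only=False):
    """Assemble a Google advanced-search string like the examples above."""
    query = f'site:{forum_domain} inurl:"{brand}" AND "{product}"'
    if unanswered_only:
        # Exclude pages whose URL contains "answer".
        query += ' -inurl:"answer"'
    return query

print(forum_query("stackoverflow.com", "brand name", "product name"))
print(forum_query("stackoverflow.com", "brand name", "product name",
                  unanswered_only=True))
```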
Ask them

And sometimes, if you want to reach the human behind the computer, you have to actually talk to the human. If you are a B2B that has a sales department, have someone on the marketing team sit in on 10–15 of those calls to see if there are any trends in regards to the types of questions they ask or issues they have. If you are a B2C, try offering a small incentive to have your customer take a survey or chat with someone for ten minutes about their experience.

If you are not comfortable reaching out to your current customers, consider utilizing Google Consumer Surveys. After collecting data from GA and other social platforms, you can use that information to hyper-focus your audience segment or create some form of a qualifier question to ensure you are targeting the right audience for your questions.

While Consumer Surveys has its issues, overall it can be a great way to collect data. This is not the platform to ask fifty questions so you can create a buyer persona; instead, pick some questions that are going to help you understand your audience a bit more. Example questions are:

  • Before purchasing [product], what is your research process?
  • Are you active on social? If so, which channels?
  • What prevents you from purchasing a product?
  • What prevents you from purchasing from a specific brand?
  • What are your favorite sites to browse for articles?

Side note: I am also a huge fan of testing potential headlines before publishing content. Obviously, this is not something you will do for every blog post, but if I were Zulily and I was considering posting a major thought leadership piece, I would probably want to set up a 2-question survey:

  • Question #1: Are you a mom?
  • If yes, question #2: Which of these articles looks most interesting to you?

The great thing about that is you only get charged for the 2nd question if they pass the qualifier round.

Give ’em what they want

Now that you have a better understanding of the kind of people you want to target, it’s important that you spend the time creating content that will actually be of value to them. Continuously revisit these research methods as your audience grows and changes.

I rambled on about my favorite techniques, but I would love to hear how you go about better understanding your own audience. Sound off in the comments below, or shoot me a tweet @TheGurbs.


Source: moz

The Impact of Queries, Long and Short Clicks, and Click Through Rate on Google’s Rankings – Whiteboard Friday

Posted by randfish

Through experimentation and analysis of patents that Google has submitted, we’ve come to know some interesting things about what the engine values. In today’s Whiteboard Friday, Rand covers some of what Google likely learns from certain user behavior, specifically queries, CTR, and long vs. short clicks.

The Impact of Queries Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about click-through rate, queries and clicks, long versus short clicks, and queries themselves in terms of how they impact the ranking.

So recently we’ve had, in the last year, a ton of very interesting experiments, or at least a small handful of very interesting experiments, looking at the impact that could be had by getting together a sample set of searchers and having them perform queries, click on results, or click on results and then click the back button. Some of these experiments have actually shown results; others haven’t. There are interesting dichotomies between the two that I’ll talk about at the end of this.

But because of that, and because of some recent patent applications and some research papers that have come to light from the search engines themselves, we started to rethink the way user and usage data make their way into search engines, and we’re starting to rethink their importance. You can see that in the ranking factor survey this year, where folks rated user and usage data as more important than ever for search rankings.

3 elements that (probably) impact your rankings

So let me talk about three different elements that we are experiencing and that we have been talking about in the SEO world and how they can impact your rankings potentially.

1) Queries

This has to do primarily with a paper that Google wrote or a patent application that they wrote around site quality and the quality of search results. Queries themselves could be used in the search results to say, “Hey, wait a minute, we, Google, see a lot of searches that are combining a brand name with a generic term or phrase, and because we’re seeing that, we might start to associate the generic term with the brand term.”

I’ll give you an example. I’ve done a search here for sushi rice. You can see there’s Alton Brown ranking number one, and then norecipes.com, makemysushi.com, and then Morimoto. Morimoto’s recipe is in Food & Wine. If lots and lots of folks are starting to say like, “Wow, Morimoto sushi rice is just incredible,” and it kind of starts up this movement around, “How do we recreate Morimoto sushi rice,” so many, many people are performing searches specifically for Morimoto sushi rice, not just generic sushi rice, Google might start to see that and say, “You know what? Because I see that hundreds of people a day are searching for this particular brand, the Morimoto sushi rice recipe, maybe I should take the result from Morimoto on foodandwine.com and move that higher up in the rankings than I normally would have them.”

Those queries themselves are impacting the search results for the non-branded version, just the sushi rice version of that query. Google’s written about this. We’re doing some interesting testing around this right now with the IMEC Labs, and maybe I’ll be able to report more soon in the future on the impact of that. Some folks in the SEO space have already reported that they see this impact as their brand grows, and as these brand associations grow, their rankings for the non-branded term rise as well, even if they’re not earning a bunch of links or getting a lot of other ranking signals that you’d normally expect.

2) Clicks and click through rate

So Google might be thinking: if there’s a result that’s significantly over-performing the ordinary performance for its ranking position, that could be a signal. For example, let’s look at the third result, “How to make perfect sushi rice.”

This is from makemysushi.com. Let’s imagine that, in this set of search results, the position three result normally gets about an 11% click-through rate on average, but Google is seeing that makemysushi.com is getting a 25% click-through rate, much higher than the normal 11%. Well, Google might kind of scratch their head and go, “You know what? It seems like whatever is showing here, the snippet, the title, the domain, the meta description, is really interesting folks. So perhaps we should rank them higher than they rank today.”

Maybe the click-through rate is a signal to Google of, “Gosh, people are deeply interested in this. It’s more interesting than the average result at that position. Let’s move them up.” This is something I’ve tested, that IMEC Labs has tested, and we’ve seen results. At least when it’s done with real searchers, and enough of them to have an impact, you can observe this. There was a post on my blog last year where we ran a series of experiments, several of which showed results time and time again. So that’s a pretty interesting one: click-through rate can work like that.
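The idea above can be sketched in a few lines of code. This is purely illustrative: the baseline click-through rates per position below are assumed numbers for the sake of the example, not real Google data, and Google has never published how (or whether) it computes anything like this.

```python
# Hypothetical sketch: flagging a result whose click-through rate (CTR)
# significantly exceeds the baseline for its ranking position.
# The baseline figures are illustrative assumptions, not real data.

BASELINE_CTR = {1: 0.30, 2: 0.15, 3: 0.11, 4: 0.08}  # assumed average CTR per position

def ctr_overperformance(position, clicks, impressions):
    """Return the ratio of observed CTR to the assumed baseline for that position."""
    observed = clicks / impressions
    return observed / BASELINE_CTR[position]

# e.g. makemysushi.com at position 3: 25% observed vs. an 11% baseline
ratio = ctr_overperformance(3, clicks=250, impressions=1000)
print(round(ratio, 2))  # → 2.27
```

A ratio well above 1.0 would be the kind of signal described here: the snippet is attracting far more clicks than its position would predict.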

3) Long versus short clicks

So this is essentially if searchers are clicking on a particular result, but they’re immediately clicking the back button and going back to the search results and choosing a different result, that could tell the search engine, could tell Google that, “You know, maybe that result is not that great. Maybe searchers are deeply unhappy with that result for whatever reason.”

For example, let’s say Google looked at number two, the norecipes.com result, and they looked at number four from Food & Wine, and they said, “Gosh, the number two result has an average time on site of 11 seconds and a bounce-back-to-the-SERPs rate of 76%. So 76% of searchers who click on No Recipes from this particular search come back and choose a different result. Clearly, they’re very disappointed.”

But number four, the Food & Wine result from Morimoto, has an average time on site of 2 minutes and 50 seconds. Of course, Google can get this data from places like Chrome. They can get it from Android. They are not necessarily looking at the same numbers that you’re looking at in your analytics. They’re not taking it from Google Analytics, and I believe them when they say that they’re not. But certainly, if you look at the terms of service for Chrome and Android, they are allowed to collect that data and use it any way they want.

The return-to-SERPs rate is only 9%. So 91% of the people who hit Food & Wine stay on there. They’re satisfied. They don’t have to search for sushi rice recipes anymore. They’re happy. Well, this tells Google, “Maybe that number two result is not making my searchers happy, and potentially I should rank number four instead.”
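The comparison described here can be sketched as a simple calculation. Again, this is a toy illustration of the "pogo-sticking" idea, not Google's actual algorithm; the click counts are hypothetical and mirror the percentages in the example above.

```python
# Illustrative sketch (not Google's actual algorithm): comparing two results
# by "pogo-sticking" rate, i.e. the share of searchers who click a result and
# then return to the SERP to choose something else.

def return_to_serp_rate(clicks, returns):
    """Fraction of clicks that bounce back to the search results page."""
    return returns / clicks

norecipes = return_to_serp_rate(clicks=100, returns=76)    # 76% return to the SERP
foodandwine = return_to_serp_rate(clicks=100, returns=9)   # only 9% return

# A lower return rate suggests searchers were satisfied; a system using this
# signal might prefer the result that keeps searchers from re-searching.
better = "foodandwine" if foodandwine < norecipes else "norecipes"
print(better)  # → foodandwine
```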

There are some important items to consider around all this…

Because if your gears turn the way my gears turned, you’re always thinking like, “Wait a minute. Can’t black hat folks manipulate this stuff? Isn’t this really open to all sorts of noise and problems?” The answer is yeah, it could be. But remember a few things.

First off, gaming this almost never works.

In fact, there is a great study published on Search Engine Land. It was called, I think, something like “Click-through rate is not an organic ranking signal. It doesn’t work.” It talked about a guy who fired up a ton of proxy servers, had them click a bunch of stuff, faking traffic essentially by using bots, and didn’t see any movement at all.

But you compare that to another report that was published on Search Engine Land, again just recently, which replicated the experiment that I and the IMEC Labs folks did using real human beings, and they did see results. The rankings rose rather quickly and kind of stayed there. So real human beings searching, very different story from bots searching.

Look, remember back in the days when AdWords first came out and Omniture was around: Google spent a ton of time and a lot of work identifying fraudulent clicks and fraudulent search activity, and they do a great job of limiting that in AdWords accounts. I’m sure they’re doing the same on the organic SEO side as well.

So manipulation is going to be very, very tough, if not impossible. If you don’t get real searchers in a realistic pattern (a mix of people who are logged in and logged out, geographically distributed, distributed by demographic profile and by previous search behavior, who look like real, normal people searching), this stuff is not going to work. Plenty of our own experiments didn’t work, either.

What if I make my site better for no gain in rankings?

Even if none of this is a ranking factor. Even if you say to yourself, “You know what, Rand? None of the experiments that you ran, or that IMEC Labs ran, or that the Search Engine Land study published convince me. I think they’re all wrong. I find holes in all of them.” Guess what? So what? It doesn’t matter.

Is there any reason that you wouldn’t optimize for a higher click-through rate? Is there any reason you wouldn’t optimize for longer clicks versus shorter clicks? Is there any reason that you wouldn’t optimize to try and get more branded search traffic, people associating your brand with the generic term? No way. You’re going to do this anyway. It’s one of those wonderful benefits of doing holistic, broad-thinking SEO and broad organic marketing in general: it helps you whether you believe these are ranking signals or not, and that’s a great thing.

The experiments have been somewhat inconsistent.

But there are some patterns in them. As we’ve been running these, what we’ve seen is if you get more people searching, you tend to have a much better chance of getting a good result. The test that I ran on Twitter and on social media, that had several thousand people participating, up, up, up, up, rose right up to the top real fast. The ones that only had a few hundred people didn’t seem to move the needle.

Same story with long tail queries versus more head of the demand curve stuff. It’s harder to move more entrenched rankings just like it would be with links. The results tended to last only between a few hours and a few days. I think that makes total sense as well, because after you’ve inflated the click signals or query signals or long click signals or whatever it is with these experimental results, over time those are going to fall away and the norm that existed previously is going to return. So naturally you would expect to see those results return back to what they were prior to the experiments.

So with all that said, I’m looking forward to some great discussion in the Q&A. I know that plenty of you out there have been trying and experimenting on your own with this stuff, and some of you have seen great results from improving your click-through rates, improving your snippets, making your pages better for searchers and keeping them on it longer. I’m sure we’re going to have some interesting discussion about all these types of experiments.

So we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Source: moz

Case Study: Can You Fake Blog Post Freshness?

Posted by anthonydnelson

Over the years, you’ve certainly read something about how Google loves fresh content. Perhaps you’ve read that sometimes it takes its love of freshness too far.

Now it’s the middle of 2015. Does freshness still play a significant role in how Google ranks search results?

To find out, I decided to conduct a small experiment on a blog. Specifically, I wanted to see if my test could answer the following questions:

  1. If you update a blog post’s date, will it receive a boost in the search engine results pages (SERPs)?
  2. Can you fake freshness?
  3. Do you have to make changes to the content?
  4. If there is a boost present, how long does it last?

Details of the test

  • This test was performed on 16 blog posts on the same site
  • All posts were originally published between September 2010 and March 2014. Each post was at least one year old at the time of this experiment.
  • Each post (except No. 16) received organic traffic throughout 2014, showing an ability to consistently rank in the SERPs
  • URLs for these posts did not change
  • The content was not edited at all
  • The content focused on evergreen topics (not the type of queries that would obviously trigger Query Deserves Freshness (QDF))
  • Only the publishing date was changed. On April 17th, the dates of these posts were set to either April 16th or April 15th, making them all look like they were one to two days old.
  • Each blog post shows the publishing date on-page
  • Posts were not intentionally shared on social media. A few of the more trafficked posts likely received a couple of tweets/likes/pins, but nothing out of the ordinary.
  • Google Search Console, Ahrefs and Open Site Explorer (OSE) did not show any new external links pointed at the posts during the time of testing

Baseline organic traffic

Before starting the test, I took a look at how the test posts were performing in organic search.

The graph below shows the organic traffic received by each of the 16 test posts for the four full weeks (March 15 – April 11) prior to the test beginning.

The important thing to note here is the organic traffic received by each page was relatively static. These posts were not bouncing around, going from 200 visits to 800 visits each week. There is little variation.

baseline organic traffic

The blue line and corresponding number highlight the weekly average for each post, which we will compare to the graph below.

Turning the test on

This one was pretty easy to implement. It took me about 15 minutes to update all of the publishing dates for the blog posts.

All posts were updated on April 17th. I began collecting traffic data again on April 26th, giving Google a week to crawl and process the changes.

Organic traffic after republishing

All 16 posts received a boost in organic traffic.

This graph shows the average organic traffic that each post received for the first four full weeks (April 26 through May 23) after republishing.

organic traffic after republishing blog posts

I expected a lift, but I was surprised at how significant it was.

Look at some of those posts, doubling in average traffic over a one month period. Crazy.

Faking the date on a blog post had a major impact on my traffic levels.

Post No. 16 received a lift as well, but it was too small to register on the graph. The traffic numbers for that post were too low to be statistically significant in any way. It was thrown into the test to see if a post with almost no organic traffic could become relevant from freshness alone.

Percentage lift

The graph below shows the percentage lift each post received in organic traffic.

organic lift from updating blog dates

Post No. 14 above actually received a 663% lift, but it skewed the visibility of the chart data so much that I intentionally cut it off.

The 16 posts received 3,601 organic visits in the four weeks beginning March 15 and ending April 11 (an average of 225 organic visits per post over the period). In the four weeks following republishing, these 16 posts received 6,003 organic visits (an average of 375 organic visits per post).

Overall, there was a 66.7% lift.
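The lift figure follows directly from the before/after totals reported above; a quick sketch of the arithmetic:

```python
# Reproducing the lift arithmetic from the case study: 16 posts, four weeks
# of organic visits before and after republishing.

visits_before = 3601   # organic visits, March 15 – April 11
visits_after = 6003    # organic visits in the four weeks after republishing
posts = 16

avg_before = visits_before / posts   # ≈ 225 visits per post over the period
avg_after = visits_after / posts     # ≈ 375 visits per post
lift = (visits_after - visits_before) / visits_before

print(f"{lift:.1%}")  # → 66.7%
```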

Search impressions (individual post view)

Below you will find a few screenshots from Google Search Console showing the search impressions for a couple of these posts.

Note: Sixteen screenshots seemed like overkill, so here are a few that show a dramatic change. The rest look very similar.

lift in organic impressions

organic impressions

increase in organic impressions

search console organic impressions

What surprised me the most was how quickly their visibility in the SERPs jumped up.

Keyword rankings

It’s safe to assume the lift in search impressions was caused by improved keyword rankings.

I wasn’t tracking rankings for all of the queries these posts were targeting, but I was tracking a few.

keyword ranking after republishing

faking blog date improves ranking

ranking boost from faked freshness

The first two graphs above show a dramatic improvement in rankings, both going from the middle of the second page to the middle of the first page. The third graph appears to show a smaller boost, but moving a post that is stuck around No. 6 up to the No. 2 spot in Google can lead to a large traffic increase.

Organic traffic (individual posts view)

Here is the weekly organic traffic data for four of the posts in this test.

You can see an annotation in each screenshot below on the week each post was republished. You will notice how relatively flat the traffic is prior to the test, followed by an immediate jump in organic traffic.

republished blog posts get temporary increase in traffic

organic traffic lift after republishing blog post

increase in organic traffic

google analytics organic traffic

These only contain one annotation for the sake of this test, but I recommend that you heavily annotate your analytics accounts when you make website changes.

Why does this work?

Did these posts all receive a major traffic boost from faking the publishing date alone?

  • Better internal linking? Updating a post date brings a post from deep in the archive closer to your blog’s home page. Link equity should flow through to it more easily. While that is certainly true, six of the 16 posts above were linked sitewide from the blog sidebar or top navigation. I wouldn’t expect those posts to see a dramatic lift from moving up in the feed because they were already well linked from the blog’s navigation.
  • Mobilegeddon update? In the Search Console screenshots above, you will see the Mobilegeddon update highlighted just a couple of days after the test began. It is clear that each post jumped dramatically before this update hit. The blog that it was tested on had been responsive for over a year, and no other posts saw a dramatic lift during this time period.
  • Google loves freshness? I certainly think this is still the case. Old posts that rank well appear to see an immediate boost when their publishing date is updated.


Let’s take a second look at the questions I originally hoped this small test would answer:

  1. If you update a blog post’s date, will it receive a boost in the SERPs? Maybe.
  2. Can you fake freshness? Yes.
  3. Do you have to make changes to the content? No.
  4. If there is a boost present, how long does it last? In this case, approximately two months, but you should test!

Should you go update all your post dates?

Go ahead and update a few blog post dates of your own. It’s possible you’ll see a similar lift in the SERPs. Then report back in a few weeks with the results in the comments on this post.

First, though, remember that the posts used in my test were solid posts that already brought in organic traffic. If your post never ranked to begin with, changing the date isn’t going to do much, if anything.

Don’t mistake this for a trick for sustained growth or a significant finding. This is just a small test I ran to satisfy my curiosity. There are a lot of variables that can influence SEO tests, so be sure to run your own. Instead of blindly trusting that what you read about working for others on SEO blogs will work for you, draw your own conclusions from your own data.

For now, though, “fresh” content still wins.


Source: moz