Do Review Velocity Or Recency Influence Local Business Rankings?

I may be getting into a groove here. Wouldn’t you like to know how much Google-based reviews are actually influencing local business rankings? I certainly would, and I’m exploring this the only way I know how: step by step, looking at rankings and poking into the listings to see what I find there. I don’t have an army of team members or robots to fetch data for me, so my studies must perforce be quite limited and small and should be viewed in that light.

My first piece, How Google’s Carousel Convinced Me That Review Counts Count For Nearly Nada, explored the idea that I could find no correlation between the sheer number of reviews a business had earned and its left-to-right ranking in the new carousel. If you read that post, you’ll likely share my puzzlement at seeing businesses with few or zero reviews outranking competitors with over one hundred of them. My conclusion, at that point, was that I needed to drill deeper into other factors – such as the velocity and recency of reviews. That will be the focus of this article.

In my earlier piece, I investigated restaurant rankings in a large city, a mid-sized town and a minute hamlet in California. Today, I’m going to revisit the mid-sized town and am choosing the query ‘restaurants san rafael’. San Rafael has a population of some 58,000 souls and is located in one of the wealthiest areas of Northern California. It has a very busy restaurant scene and residents are likely to be tech-savvy. In other words, this is the kind of town in which you would expect a lot of people to be actively engaged in writing reviews.

Take A Seat At This Table
The following table presents the top 10 restaurants in San Rafael, according to Google’s carousel. Accompanying each business name, you will see its total review count, followed by a count of the reviews earned in each of the first six months of 2013, and finally, the total number of reviews the business has earned in the past six months. The premise here is that Google is ‘supposed to be’ interested in web-based freshness and activity, and so one might theorize that a business with the largest volume of recent reviews would have an advantage over competitors with less up-to-the-minute review profiles. Right?

Let’s see…

Restaurant (Total Review Count)       Jan  Feb  Mar  Apr  May  Jun  6-Mo. Total
Lotus Cuisine of India (221)           0    1    1    0    2    2        6
Taqueria San Jose (19)                 0    1    0    0    0    0        1
Bay Thai Cuisine (6)                   0    0    0    0    0    1        1
Il Davide (266)                        0    2    0    1    1    1        5
Sol Food (103)                         0    1    0    0    2    1        4
My Thai Restaurant (15)                0    1    0    0    0    1        2
San Rafael Joe’s (10)                  0    0    0    0    1    0        1
Las Camelias Cocina Mexicana (140)     0    0    0    1    0    0        1
Ristorante La Toscana (11)             0    0    0    0    0    0        0
Sushi To Dai For (26)                  0    0    0    0    1    0        1

(Businesses appear in carousel order, left to right.)
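
For anyone who’d like to run the same tally against another town, the bookkeeping is simple enough to script. Here is a minimal sketch in Python; the review entries are hypothetical stand-ins, since I gathered my own counts by hand from each Google listing.

```python
from collections import Counter
from datetime import date

# Hypothetical stand-in data: one (business, review_date) pair per review.
# In my case, these were collected by hand from each Google listing.
reviews = [
    ("Lotus Cuisine of India", date(2013, 5, 12)),
    ("Lotus Cuisine of India", date(2013, 6, 3)),
    ("Sol Food", date(2013, 5, 20)),
    # ... one entry per review ...
]

# The six-month window examined in the table above.
start, end = date(2013, 1, 1), date(2013, 6, 30)

# Per-business, per-month tallies within the window (keying on month
# alone is safe here because the window sits inside a single year).
monthly = Counter((biz, d.month) for biz, d in reviews if start <= d <= end)

# Six-month total per business.
totals = Counter(biz for biz, d in reviews if start <= d <= end)

for biz, total in totals.most_common():
    by_month = [monthly[(biz, m)] for m in range(1, 7)]
    print(f"{biz}: {by_month} (6-month total: {total})")
```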

What I See
At first, looking at the business holding the first spot in the carousel, Lotus Cuisine of India, I thought, “By Jove, I think I’ve got it!” This restaurant has not only earned more reviews over the past 6 months than any of its competitors, but has also earned more reviews in the past month than any other.

Then my shoulders slumped. If I thought this pattern was one that would continue throughout this tiny set of results, I was quickly disabused of my fantasy. As you can see, Taqueria San Jose, sitting at #2, has only had a single review in the last 6 months and that was way back in February. It is outranking competitors with both more total reviews and more recent reviews.

Examine the table for yourself and, if you can see any rhyme or reason in how two businesses with a single review to their name are outranking one with 5 reviews and another with 4, then you’re a better man than I am, Gunga Din.
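
To put a number on the eyeballing, one quick check is a rank correlation between carousel position and review counts. Here’s a sketch in Python, using the counts straight from my table above and assuming SciPy is available; a rho near -1 would mean more reviews go hand in hand with better positions, while a rho near 0 means no detectable relationship.

```python
from scipy.stats import spearmanr

# Carousel positions 1-10 for the ten San Rafael restaurants, paired
# with their review counts from the table above (same order).
positions      = list(range(1, 11))
total_reviews  = [221, 19, 6, 266, 103, 15, 10, 140, 11, 26]
recent_reviews = [6, 1, 1, 5, 4, 2, 1, 1, 0, 1]   # past 6 months

for label, counts in (("total reviews", total_reviews),
                      ("6-month reviews", recent_reviews)):
    rho, p = spearmanr(positions, counts)
    # A negative rho would mean higher counts sit in better (leftmost) spots.
    print(f"{label}: rho = {rho:.2f} (p = {p:.2f})")
```

With only ten data points, of course, this is a sanity check rather than a study.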

What Else I See
I was shocked by how few people have used Google to leave a review in the past 6 months for restaurants in this lively, well-to-do urban community. When I actually started digging into the profiles, something quite interesting became evident: for the profiles with 100–200+ reviews, nearly all of the reviews were left more than a year ago.

I have two tentative theories about this:

1) Google’s decision to force users to sign up for a Google+ profile in order to be able to leave a review made a ton of very active reviewers jump ship. Most reviews I saw date to before Google’s decision to do this. Interesting, hmm?

2) Yelp is winning the review battle in Northern California. Let’s look at that top ranked business, Lotus Cuisine of India. It has a near-identical total historic review count on both Yelp (224 reviews) and on Google (221 reviews). But, in the past 6 months, only 6 people have made the effort to review this restaurant on Google, whereas 21 people have reviewed it on Yelp.

Things get really interesting when we look at another business, sitting in position number 5 in Google’s carousel: Sol Food. Sol Food has received a pitiful 4 reviews on Google in the past half year. I’ve visited San Rafael and there is literally a line coming out the door of this Puerto Rican phenom of an eatery – it’s that popular. And how many reviews does Sol Food have on Yelp in the past 6 months? 160! Count ’em!

So, let’s say we’re seeing hints of a pattern that, at least in San Rafael, people may not be bothering to review businesses on Google, but they are definitely very actively reviewing them on Yelp. *Note, San Rafael is just a stone’s throw away from Yelp headquarters in San Francisco, so results like these may be skewed by the local popularity of this review portal, but in a piece I wrote a few months ago, I noted that I see Yelp results coming up for nearly every local keyword phrase I investigate, everywhere in the US. I’ve been engaged this year in some large studies and see Yelp results and Yelp reviews from sea to shining sea.

So Where Are We At With All This?
My study is so minute. Investigation of a single keyword search reveals only a particle of the big picture, but from this little test, I think I have learned that:

1. I have been unable to explain rank via correlation with either sheer review count or review recency.
2. Businesses with few or no total or recent reviews can outrank more actively reviewed businesses.
3. The overwhelming majority of the Google-based reviews I saw were from 1+ years ago. There are tumbleweeds blowing in Google review land.
4. The volume and recency of Google-based reviews are not a good indicator of the actual popularity of a business. Yelp appears to provide a more real-time depiction of restaurant patron experience, at least in Northern California.
5. Despite spending an hour or two digging through reviews again today, I don’t think I’m really any closer to knowing how reviews DO affect rank.

But, I do think I’m getting a sense of how they DON’T.

What do you think? Are there hidden gems in my data table that I’ve overlooked? Does Google look like a ghost town in your town when you search for reviews of popular businesses over the past 6 months? Do you currently employ 10 guys who don’t have enough to do and would like to take my seed of a study and let it bloom across multiple searches to get big data? I welcome your thoughts, your rebuttals and your ideas!

14 Responses to “Do Review Velocity Or Recency Influence Local Business Rankings?”

  1. on 06 Jul 2013 at 3:05 pm Phil Rozek

    I do believe you’re on a roll, Miriam!

    This is such a complex question, because (as you well know) there are so many other ranking factors swirling around in the blender that is Google.

    For instance, I’m convinced that the number/quality/diversity of reviews on non-Google sites is an unsung ranking factor. Getting reviews on other sites has been a great boon to my clients, as far as I’ve been able to tell.

    Also, I would think that Google gives at least a little credence to the span of time over which you’ve gotten reviews. If you’ve got Google reviews every year since ’08, I would imagine that that would factor in – and you might even have an advantage over your whipper-snapper competitor across the street who just started getting reviews this year.

    I certainly don’t think you’re missing anything here, but I do think it’s impossible to figure out all the review-related factors that Google may (or may not) use to sort out rankings. For me, there are just so many reasons to try to get reviews – on Google and on other sites – that I see the potential rankings benefits mostly as icing on the cake.

    Thanks for posting!

  2. on 06 Jul 2013 at 3:11 pm Carmen Rane Hudson

    I am absolutely in awe of people like you who can think up and perform these kinds of tests. My first reaction when staring at those Carousel results was, “I have absolutely no idea what the heck is causing these things to show up in order and I think it’s maybe random.”

    But it would seem to make sense that Google’s watching reviews on other sites.

  3. on 06 Jul 2013 at 4:05 pm Linda Buquet

    Great job Miriam, you ARE on a roll!

    Phil I love your Google blender analogy. I think she often blends a smoothie made with everything but the kitchen sink!

    One guy posted in the ranking puzzle section this week about a realtor that’s 1) out of business, 2) has no URL on the Place page, 3) has a website that doesn’t resolve, 4) has NO PHONE on the Place page (which should mean the listing is broken), 5) has no reviews, and 6) PLUS has a KW-stuffed name that violates the guidelines.

    BUT it ranks #1 in a competitive market. So sometimes the algo is just beyond figuring out.

    Miriam, that big post we talked about, the one I’m working on for Tuesday, kinda helps prove what really controls the local ranking order… My theory does not work for these restaurant results, which is odd. It’s like it’s a different algo.

    Totally works and is provable for Drs, Chiros and plumbers. But it does not jibe for restaurants. Weird!


  4. […] […]

  5. on 06 Jul 2013 at 11:26 pm Nyagoslav

    Great research, Miriam. However, I learned the hard way that researching only one factor in isolation is practically impossible. Additionally, as we all know, correlation doesn’t imply causation, so even if one manages to find some correlation between a potential ranking factor and its effect on rankings, this might very well not be the cause, unless of course a really enormous set of data is being researched.

    Here is my theory about the one thing that affects rankings in NO way – quantity. That is, quantity itself is never an important metric. Sheer numbers don’t mean (almost) anything to Google. What matters is not even quality in the general sense we use the word. It is relevance. So, in the case of how reviews affect rankings, this is what I think:

    1) Reviews are content, and content (especially if it is unique) is good. In some cases, reviews might contain relevant terms, which Google uses when they generate “descriptive terms” for a page. These descriptive terms do have a very significant impact on rankings. And the up to 5 terms Google displays for any page are just part of all the terms they believe are associated with that page.

    2) Native Google reviews (if more than 10) display an overall score in the search results, so this might have an effect on click-through rates, which, in turn, might be a direct ranking factor.

    3) Being reviewed by an active user of Google+ (or Yelp, or another site) itself brings additional authority to the business, and the native search results for the respective website are affected by this. Additionally, it is very possible that an active reviewer on a particular platform might also have a lot of acquaintances using this or other platforms. As we know, Google personalizes the search results, so, with other things being equal, a business that was reviewed on Google by an active Google+ user would be more visible to more people than a business that was not reviewed by this user, or by any user at all.

    4) Reviews on other websites might also serve as citations by themselves. They could also serve as “boosters” for the citation on which they are found. Google takes a lot of factors into account when determining the importance/quality of a citation. One of these factors is the completeness of the information found on the citation page. If there are reviews on this page, this increases the likelihood that this citation is authoritative.

    Just my 2 cents here :)

    Again, great research and very important conclusions, Miriam!

    Nyagoslav

  6. on 07 Jul 2013 at 12:00 pm admin

    Hi Phil,

    Like this: “the blender that is Google”.

    And really happy you shared this: “Getting reviews on other sites has been a great boon to my clients, as far as I’ve been able to tell.”

    Do you think it has affected their Google rankings, or is this simply good in its own right? I know I have clients who reap big benefits simply from doing well on Yelp, for example. I don’t know if any of the Local SEOs we all know has ever done any work on whether there is a correlation between 3rd party rankings and an observable ranking boost. I’d like to know what you’ve seen.

    Thanks so much for stopping by! Appreciate your intelligent comments.

    Miriam

  7. on 07 Jul 2013 at 12:02 pm admin

    Hi Carmen,
    Thanks for your kind comment. Yes, for sure, Google is aware enough of other reviews to still be citing them as sources for further data right on local businesses’ listings. I’m glad you enjoyed this post.
    Miriam

  8. on 07 Jul 2013 at 12:05 pm admin

    Hi Linda!

    “BUT it ranks #1 in a competitive market. So sometimes the algo is just beyond figuring out.”

    And there’s the rub! It is so true that some rankings just seem inexplicable, no matter what metrics one might investigate. It makes research very challenging, because some things can follow a pattern and then Google throws an obsolete business into Phil’s ‘blender’ and you just say, “What???”.

    Can’t wait for that Tuesday post, Linda! I wonder if restaurants behave differently because there is typically such rich data about them vs. many other verticals? Thanks so much for stopping by and for the kind link to this piece from the Local Search Forum.
    Miriam

  9. on 07 Jul 2013 at 12:12 pm admin

    Hi Nyagoslav,
    Fantastic comment! Thank you. I’m glad you think my conclusions make some sense, but I totally get what you’re saying about researching factors in isolation. All of the points you suggest make sense.

    At the same time, what I’m trying to do with these posts is approach this like any local business owner might who is asking himself, “Why does this business rank here? Why is it outranking my business?” That concept, coupled with my resources for research being only one person (me!), is what is making me try to dig into the things Google seems to be highlighting in its display (like review count), one by one, to see if they shed light on who is ranking where.

    I haven’t even looked at any organic factors yet! So, for sure, I’m looking here at the little stuff but, intellectually speaking, I am aware that it isn’t really possible to know the big answer. I really like the points you have made. Fortunately, with everyone in the Local SEO world working on little facets of the big picture over time, we can all benefit by gaining some sort of glimpse of what that picture is.

    So appreciate your detailed comment!
    Miriam

  10. on 07 Jul 2013 at 11:51 pm Phil Rozek

    @Miriam

    I haven’t observed that reviews on third-party sites alone account for this amount or that amount of rankings mojo. As we all know, it’s pretty much impossible to isolate review factors. So the most I can say is that it’s part of my / most of my clients’ processes, that we get good results, and that we seem to get the best results when we can drum up reviews on a variety of sites. But I don’t have any “harder” evidence than that. Wish I did, though!

  11. on 08 Jul 2013 at 11:22 am admin

    Hi Phil!
    Got it. Yes, it would be cool if one could take a brand new business and try to measure the effect once 3rd party reviews started coming in, but it would likely be impossible because of all the factors working in conjunction that lead to traffic and phones ringing. Review diversity is good for so many reasons. Sounds like you have happy clients!
    Miriam

  12. on 08 Jul 2013 at 5:46 pm Jim Froling

    Fabulous work, Miriam.

    Given all of the signals at work in Google’s “blender” (love it, Phil!), or algorithm for the scientifically inclined, it is tough to determine the weight of one signal over another.

    Another reason that I’m looking forward to David Mihm’s update on local ranking factors. It should be interesting to compare/contrast this year’s with past years. Certainly, a lot has changed.

    None of us knows the RPMs of “the blender” or even all of the ingredients thrown in. But it is insights like yours that give us all another puzzle piece that will fit in place somewhere, someday.

    Well done!!

  13. on 08 Jul 2013 at 8:15 pm admin

    Hi Jim!
    Nice to see you here. I can pretty much guarantee that LSRF will be great this year, and as you say, so much has changed since the last edition in 2012, it will be totally fascinating to see how it all adds up. Stay tuned on that, as I know David will be publishing soon. I’m happy you like this post and appreciate you taking the time to comment.

    Miriam

  14. on 16 Jul 2013 at 4:29 am Rebecca Haden

    Last year, we did some testing (on multiple computers in different parts of town, signed in and signed out) and isolated the rankings of one category of local businesses. With multiple trials, we could see that one site most often came up at #1, another came up most often at #2, and another came up most often at #3 for the basic keyword. No other local business in this category came up ahead of these three.

    When the carousel appeared for local search this year, the top three showed left to right, in order.

    I think it’s just Google’s ranking of the sites.
