tinyigloo

Do Review Velocity Or Recency Influence Local Business Rankings?

I may be getting into a groove here. Wouldn’t you like to know how much Google-based reviews are actually influencing local business rankings? I certainly would, and I’m exploring this the only way I know how: step by step, looking at rankings and poking into the listings to see what I find there. I don’t have an army of team members or robots to fetch data for me, so my studies must perforce be quite limited and small and should be viewed in that light.
My first piece, How Google’s Carousel Convinced Me That Review Counts Count For Nearly Nada, reported that I could find no correlation between the sheer number of reviews a business had earned and its left-to-right ranking in the new carousel. If you read that post, you’ll likely share my puzzlement at seeing businesses with few or zero reviews outranking competitors with over one hundred of them. My conclusion, at that point, was that I needed to drill deeper into other factors – such as the velocity and recency of reviews. That will be the focus of this article.
In my earlier piece, I investigated restaurant rankings in a large city, a mid-sized town and a minute hamlet in California. Today, I’m going to revisit the mid-sized town and am choosing the query restaurants san rafael. San Rafael has a population of some 58,000 souls and is located in one of the wealthiest areas of Northern California. It has a very busy restaurant scene and residents are likely to be tech savvy. In other words, this is the kind of town in which you would expect a lot of people to be actively engaged in writing reviews.
Take A Seat At This Table
The following table presents the top 10 restaurants in San Rafael, according to Google’s carousel. Alongside each business name, you will see its total review count, followed by a count of the reviews it earned in each of the first six months of 2013, and finally a total for the past 6 months. The premise here is that Google is supposed to be interested in web-based freshness and activity, so one might theorize that a business with the largest volume of recent reviews would have an advantage over less up-to-the-minute review profiles. Right?
Let’s see…

Rank  Restaurant                     Total  Jan  Feb  Mar  Apr  May  Jun  Last 6 Mo.
1     Lotus Cuisine of India           221    0    1    1    0    2    2      6
2     Taqueria San Jose                 19    0    1    0    0    0    0      1
3     Bay Thai Cuisine                   6    0    0    0    0    0    1      1
4     Il Davide                        266    0    2    0    1    1    1      5
5     Sol Food                         103    0    1    0    0    2    1      4
6     My Thai Restaurant                15    0    1    0    0    0    1      2
7     San Rafael Joe’s                  10    0    0    0    0    1    0      1
8     Las Camelias Cocina Mexicana     140    0    0    0    1    0    0      1
9     Ristorante La Toscana             11    0    0    0    0    0    0      0
10    Sushi To Dai For                  26    0    0    0    0    1    0      1

What I See
At first, looking at the business in the first spot in the carousel, Lotus Cuisine of India, I thought, “By Jove, I think I’ve got it!” This restaurant has not only earned more reviews over the past 6 months than any of its competitors, but has also earned more reviews in the past month than any other.
Then my shoulders slumped. If I thought this pattern was one that would continue throughout this tiny set of results, I was quickly disabused of my fantasy. As you can see, Taqueria San Jose, sitting at #2, has only had a single review in the last 6 months and that was way back in February. It is outranking competitors with both more total reviews and more recent reviews.
Examine the table for yourself and, if you can see any rhyme or reason in how two businesses with a single review to their name are outranking one with 5 reviews and another with 4, then you’re a better man than I am, Gunga Din.
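If you’d like to put a rough number on that lack of pattern, here’s a quick sketch in Python, with the table’s figures hard-coded, that computes a Spearman rank correlation between carousel position and 6-month review count. This is just a way of turning the eyeballing above into a single figure; with only ten listings and lots of tied counts, nothing about it is statistically meaningful.

```python
# Carousel position (1 = leftmost) paired with each restaurant's
# review count over the past 6 months, taken from the table above.
recent_reviews = [6, 1, 1, 5, 4, 2, 1, 1, 0, 1]  # positions 1 through 10

def average_ranks(values):
    """Rank values ascending, giving tied values their average rank."""
    ranks = [0.0] * len(values)
    ordered = sorted(range(len(values)), key=lambda i: values[i])
    i = 0
    while i < len(ordered):
        j = i
        # Walk forward through the run of tied values.
        while j + 1 < len(ordered) and values[ordered[j + 1]] == values[ordered[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman rho: the Pearson correlation of the two rank vectors."""
    rx, ry = average_ranks(xs), average_ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

positions = list(range(1, len(recent_reviews) + 1))
rho = spearman(positions, recent_reviews)
print(round(rho, 2))  # about -0.59 for this snapshot
```

A value of 0 would mean no relationship at all, and -1 would mean the carousel tracked recent-review volume perfectly. Note that at this sample size a single listing, such as Lotus Cuisine of India sitting at #1 with the most recent reviews, can swing the figure wildly, so a moderately negative number here proves nothing either way.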
What Else I See
I was shocked by how few people have used Google to leave a review in the past 6 months for restaurants in this lively, well-to-do community. When I started digging into the profiles, something quite interesting became evident: for the profiles with 100–200+ reviews, nearly all of those reviews were left more than a year ago.
I have two tentative theories about this:
1) Google’s decision to force users to sign up for a Google+ profile in order to be able to leave a review made a ton of very active reviewers jump ship. Most reviews I saw date to before Google’s decision to do this. Interesting, hmm?
2) Yelp is winning the review battle in Northern California. Let’s look at that top ranked business, Lotus Cuisine of India. It has a near-identical total historic review count on both Yelp (224 reviews) and on Google (221 reviews). But, in the past 6 months, only 6 people have made the effort to review this restaurant on Google, whereas 21 people have reviewed it on Yelp.
Things get really interesting when we look at another business, sitting in position number 5 in Google’s carousel: Sol Food. Sol Food has received a pitiful 4 reviews on Google in the past half year. I’ve visited San Rafael and there is literally a line coming out the door of this Puerto Rican phenom of an eatery – it’s that popular. And how many reviews does Sol Food have on Yelp in the past 6 months? 160! Count ’em!
So, let’s say we’re seeing hints of a pattern: at least in San Rafael, people may not be bothering to review businesses on Google, but they are definitely very actively reviewing them on Yelp. *Note: San Rafael is just a stone’s throw from Yelp headquarters in San Francisco, so results like these may be skewed by the local popularity of this review portal. But in a piece I wrote a few months ago, I noted that I see Yelp results coming up for nearly every local keyword phrase I investigate, everywhere in the US. I’ve been engaged this year in some large studies and see Yelp results and Yelp reviews from sea to shining sea.
So Where Are We At With All This?
My study is so minute. Investigation of a single keyword search reveals only a particle of the big picture, but from this little test, I think I have learned that:
1. I have been unable to explain rank via sheer review count, review velocity, or review recency.
2. Businesses with few or no overall or recent reviews can outrank more actively reviewed businesses.
3. The overwhelming majority of the Google-based reviews I saw were from 1+ years ago. There are tumbleweeds blowing in Google review land.
4. The volume and recency of Google-based reviews are not a good indicator of the actual popularity of a business. Yelp appears to provide a more real-time depiction of restaurant patron experience, at least in Northern California.
5. Despite spending an hour or two digging through reviews again today, I don’t think I’m really any closer to knowing how reviews DO affect rank.
But, I do think I’m getting a sense of how they DON’T.
What do you think? Are there hidden gems in my data table that I’ve overlooked? Does Google look like a ghost town in your town when you search for reviews of popular businesses over the past 6 months? Do you currently employ 10 guys who don’t have enough to do and would like to take my seed of a study and let it bloom across multiple searches to get big data? I welcome your thoughts, your rebuttals and your ideas!