July 2013

Do Review Velocity Or Recency Influence Local Business Rankings?


I may be getting into a groove here. Wouldn’t you like to know how much Google-based reviews are actually influencing local business rankings? I certainly would, and I’m exploring this the only way I know how: step by step, looking at rankings and poking into the listings to see what I find there. I don’t have an army of team members or robots to fetch data for me, so my studies must perforce be quite limited and small and should be viewed in that light.

My first piece, How Google’s Carousel Convinced Me That Review Counts Count For Nearly Nada, documented the fact that I could find no correlation between the sheer number of reviews a business had earned and its left-to-right ranking in the new carousel. If you read that post, you’ll likely share my puzzlement at seeing businesses with few or zero reviews outranking competitors with over one hundred of them. My conclusion, at that point, was that I needed to drill deeper into other factors – such as the velocity and recency of reviews. That will be the focus of this article.

In my earlier piece, I investigated restaurant rankings in a large city, a mid-sized town and a minute hamlet in California. Today, I’m going to revisit the mid-sized town and am choosing the query restaurants san rafael. San Rafael has a population of some 58,000 souls and is located in one of the wealthiest areas of Northern California. It has a very busy restaurant scene and residents are likely to be tech savvy. In other words, this is the kind of town in which you would expect a lot of people to be actively engaged in writing reviews.

Take A Seat At This Table
The following table presents the top 10 restaurants in San Rafael, according to Google’s carousel. Accompanying each business name, you will see its all-time review count, followed by a count of the reviews earned in each of the first six months of 2013 and, finally, the total number of reviews the business has earned over those six months. The premise here is that Google is ‘supposed to be’ interested in web-based freshness and activity, and so, one might theorize that a business with the largest volume of recent reviews would have an advantage over less up-to-the-minute review profiles. Right?

Let’s see…

Restaurant (All-Time Review Count)  Jan  Feb  Mar  Apr  May  Jun  6-Mo. Total
Lotus Cuisine of India (221)        0    1    1    0    2    2    6
Taqueria San Jose (19)              0    1    0    0    0    0    1
Bay Thai Cuisine (6)                0    0    0    0    0    1    1
Il Davide (266)                     0    2    0    1    1    1    5
Sol Food (103)                      0    1    0    0    2    1    4
My Thai Restaurant (15)             0    1    0    0    0    1    2
San Rafael Joe’s (10)               0    0    0    0    1    0    1
Las Camelias Cocina Mexicana (140)  0    0    0    1    0    0    1
Ristorante La Toscana (11)          0    0    0    0    0    0    0
Sushi To Dai For (26)               0    0    0    0    1    0    1

What I See
At first, looking at the business in the first spot in the carousel, Lotus Cuisine of India, I thought, “By Jove, I think I’ve got it!” This restaurant has not only earned more reviews over the past 6 months than any of its competitors, but has also earned more reviews in the past month than any other.

Then my shoulders slumped. If I thought this pattern was one that would continue throughout this tiny set of results, I was quickly disabused of my fantasy. As you can see, Taqueria San Jose, sitting at #2, has only had a single review in the last 6 months and that was way back in February. It is outranking competitors with both more total reviews and more recent reviews.

Examine the table for yourself and, if you can see any rhyme or reason in how two businesses with a single review to their name are outranking one with 5 reviews and another with 4, then you’re a better man than I am, Gunga Din.
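If you want to put a rough number on that lack of pattern, a quick Spearman rank correlation will do it. Here is a minimal Python sketch, offered purely as an illustration, that feeds the figures from my table above into SciPy’s spearmanr. A rho near -1 would mean “more reviews, further left”; a rho near 0 means there is no monotonic relationship to speak of.

    # Rough check: does carousel position track review counts in San Rafael?
    # Figures are taken from the table above: (carousel position,
    # all-time review count, reviews earned Jan-Jun 2013).
    from scipy.stats import spearmanr

    restaurants = [
        (1, 221, 6),    # Lotus Cuisine of India
        (2, 19, 1),     # Taqueria San Jose
        (3, 6, 1),      # Bay Thai Cuisine
        (4, 266, 5),    # Il Davide
        (5, 103, 4),    # Sol Food
        (6, 15, 2),     # My Thai Restaurant
        (7, 10, 1),     # San Rafael Joe's
        (8, 140, 1),    # Las Camelias Cocina Mexicana
        (9, 11, 0),     # Ristorante La Toscana
        (10, 26, 1),    # Sushi To Dai For
    ]

    positions = [pos for pos, _, _ in restaurants]
    all_time = [total for _, total, _ in restaurants]
    recent = [six_mo for _, _, six_mo in restaurants]

    # Spearman's rho: -1 would mean "more reviews, further left";
    # 0 means no monotonic relationship at all.
    rho_total, p_total = spearmanr(positions, all_time)
    rho_recent, p_recent = spearmanr(positions, recent)

    print(f"Position vs. all-time reviews: rho={rho_total:.2f} (p={p_total:.2f})")
    print(f"Position vs. 6-month reviews:  rho={rho_recent:.2f} (p={p_recent:.2f})")

On a sample of 10 restaurants the p-values won’t settle anything; this is simply a way of quantifying the eyeball test.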

What Else I See
I was shocked by how few people have used Google to leave a review for restaurants in this lively, well-to-do urban community over the past 6 months. When I actually started digging into the profiles, something quite interesting became evident. For the profiles with 100-200+ reviews, nearly all of those reviews were left more than a year ago.

I have two tentative theories about this:

1) Google’s decision to force users to sign up for a Google+ profile in order to be able to leave a review made a ton of very active reviewers jump ship. Most reviews I saw date to before Google’s decision to do this. Interesting, hmm?

2) Yelp is winning the review battle in Northern California. Let’s look at that top-ranked business, Lotus Cuisine of India. It has a near-identical all-time review count on Yelp (224 reviews) and on Google (221 reviews). But, in the past 6 months, only 6 people have made the effort to review this restaurant on Google, whereas 21 people have reviewed it on Yelp.

Things get really interesting when we look at another business, sitting in position number 5 in Google’s carousel: Sol Food. Sol Food has only received a pitiful 4 reviews on Google in the past half year. I’ve visited San Rafael and there is literally a line coming out the door of this Puerto Rican phenom of an eatery – it’s that popular. And how many reviews does Sol Food have on Yelp in the past 6 months? 160! Count ’em!

So, let’s say we’re seeing hints of a pattern that, at least in San Rafael, people may not be bothering to review businesses on Google, but they are definitely very actively reviewing them on Yelp. *Note, San Rafael is just a stone’s throw away from Yelp headquarters in San Francisco, so results like these may be skewed by the local popularity of this review portal, but in a piece I wrote a few months ago, I noted that I see Yelp results coming up for nearly every local keyword phrase I investigate, everywhere in the US. I’ve been engaged this year in some large studies and see Yelp results and Yelp reviews from sea to shining sea.

So Where Are We At With All This?
My study is so minute. Investigation of a single keyword search reveals only a particle of the big picture, but from this little test, I think I have learned that:

1. I have been unable to explain carousel rank by correlating it with sheer review count or recency.
2. Businesses with few or no reviews, overall or recent, can outrank more actively reviewed businesses.
3. The overwhelming majority of the Google-based reviews I saw were from 1+ years ago. There are tumbleweeds blowing in Google review land.
4. The volume and recency of Google-based reviews are not a good indicator of the actual popularity of a business. Yelp appears to provide a more real-time depiction of restaurant patron experience, at least in Northern California.
5. Despite spending an hour or two digging through reviews again today, I don’t think I’m really any closer to knowing how reviews DO affect rank.

But, I do think I’m getting a sense of how they DON’T.

What do you think? Are there hidden gems in my data table that I’ve overlooked? Does Google look like a ghost town in your town when you search for reviews of popular businesses over the past 6 months? Do you currently employ 10 guys who don’t have enough to do and would like to take my seed of a study and let it bloom across multiple searches to get big data? I welcome your thoughts, your rebuttals and your ideas!

How Google’s Carousel Convinced Me That Review Counts Count For Nearly Nada


Working with a will to earn great reviews for your local business?

Believe your company’s high customer satisfaction standards and pro-active review earning policies will be rewarded by Google?

You may be right…but then again, you may not be. Let’s take a look at this.

Whether or not you are a fan of Google’s new carousel view for businesses like hotels and restaurants, one thing it has made quite easy for everyone is an at-a-glance assessment of review counts amongst competitors. Since the launch of the carousel, I have read some comments to the effect that this new display is the great leveler – in other words, that there are no longer rankings for these types of businesses.

I can see some sense in that opinion, with every business being displayed on a single horizontal plane. However, I would suggest that in a culture like mine that reads from left to right, what comes furthest to the left automatically seems to indicate priority. That’s where paragraphs start when you read, where newsmen pack their grabbiest words in a headline and where SEOs put their most important elements in a title tag. My brain has been trained to think that most left is most important. This is why you read the words ‘I can see’ at the beginning of this paragraph first instead of starting somewhere in the middle.

Because of this, as a Google user, some part of me assumes that the businesses ordered closest to the left of my screen have somehow been judged by Google to be of more relevance than those further to the right…certainly of more importance than those I have to scroll the display bar horizontally to view.

And this brings me back to the topic of my piece – the disconnect between left-to-right rankings and review counts. Now, nobody but a few wizards at Google knows the exact amount of influence review count has on overall local business rankings, but from the effort local business owners and Local SEOs have put into the earning of reviews over the past half decade, it’s obvious that most of us think reviews are pretty important.

Imagine my surprise in noting how little review counts seem to matter with regard to where a business is situated in the carousel.

I did a bunch of informal searches for ‘restaurants’ in both small and large cities and will share just a few of the results. These results reflect what I saw across the board. I will call the restaurants 1, 2, 3 etc., for the sake of explanation, with 1 being the restaurant appearing most to the left in the carousel display. I will show the first 10 results for each search and will focus on a large city, a medium-sized town and a tiny village in California, none of which are where I am physically located.

Search Term: Restaurants San Francisco

1. 54
2. 761
3. 1795
4. 175
5. 21
6. 2768
7. 827
8. 156
9. 389
10. 1752

Note that: Restaurant #3, with 1,795 reviews, is being surpassed by one restaurant with only 54 reviews and another with 761. Also, restaurant #6 has far more reviews than anyone else – nearly 1,000 more than its closest competitor – yet is only 6th in line.

Search Term: Restaurants San Rafael

1. 221
2. 19
3. 6
4. 266
5. 103
6. 15
7. 10
8. 140
9. 26
10. 11

Note that: Restaurant #3, with only 6 reviews, is standing in line ahead of competitors with 266, 103 and 140 reviews. Why?

Search Term: Restaurants Boonville

1. 4
2. 0
3. 107
4. 5
5. 0
6. 5
7. 8
8. 27
9. 70
10. 14

Note that: In this tiny village, it certainly looks to me like restaurant #3, with 107 reviews, is the place people go, but an eatery with ZERO reviews and another with only 4 have somehow cut in line ahead of it. Restaurants #8 and #9 are also way down the line, behind several establishments with few or no reviews.

The above summary gives just a sampling of this phenomenon. Note that my small study has not taken into account review velocity or recency, but I can’t help thinking that any average user is going to wonder why a business with no reviews is being given precedence over one with 107 of them.
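If you’d like to put a rough number on this, a Spearman rank correlation between carousel position and review count does the job. Here is a minimal Python sketch (using SciPy’s spearmanr, purely as an illustration) run against the three lists above; a strongly negative rho would mean that more reviews reliably push a business toward the left of the carousel, while a rho near 0 means position and review count barely track each other.

    # Rank correlation between carousel position (1 = leftmost) and review
    # count for the three informal searches listed above.
    from scipy.stats import spearmanr

    searches = {
        "restaurants san francisco": [54, 761, 1795, 175, 21, 2768, 827, 156, 389, 1752],
        "restaurants san rafael":    [221, 19, 6, 266, 103, 15, 10, 140, 26, 11],
        "restaurants boonville":     [4, 0, 107, 5, 0, 5, 8, 27, 70, 14],
    }

    for query, counts in searches.items():
        positions = list(range(1, len(counts) + 1))
        rho, p = spearmanr(positions, counts)
        print(f"{query}: rho={rho:.2f} (p={p:.2f})")

Again, ten results per search is a tiny sample, so treat any output as a back-of-the-envelope reading rather than proof of anything.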

What This Tells Us
Honestly, it doesn’t tell us much about how Google’s algo works for local businesses. But…I would posit that if you are trying to do competitive analysis to appear more towards the left in the carousel, the sheer number of reviews you have isn’t going to influence your hoped-for change of positioning. If you can be outranked by a restaurant with zero reviews, getting 100 of them for your business probably won’t help.

So What Should You Do, Then?
I would suggest looking at other factors like the authority of the website and both the consistency and breadth of citations. Maybe social factors, too? Perhaps locally relevant links or unstructured citations? I purposely didn’t search in my own town because I didn’t want my proximity to any business to influence my results, but that will almost certainly be a factor, too – how close your prospective patron is to your place of business.

Am I Saying To Forget About Reviews?
No, no! Regardless of the influence of review counts on rank, your customers’ glowing reviews of your rosemary polenta and baba ghanoush will do much to bring the hungry public through your doors. I’m just saying that if you’re working yourself into a sweat about being too far to the right of the screen in the carousel, it may be best to invest the bulk of your time in improving metrics other than review counts.

What do you think? Is there a flaw in my logic? Do left-to-right displays make you think left is better? Are you seeing any patterns that explain who is ranking left-most? Would you like to share? I’d love to hear!