If there’s one thing I’ve learned about the smartphone market while working at gap intelligence, it’s that it’s confusing. When Apple recently acknowledged that its older-generation iPhones do, in fact, slow down over time due to battery issues, I was admittedly faced with a personal dilemma… Do I throw down the $29 for a replacement battery and essentially recommit to my dated iPhone 6 for an indefinite amount of time, or do I take the news as a sign that it’s time for me to step up to a newer model?
Mazed and Confused
As I thought about my options, I realized that the latter scenario was absolutely terrifying to me. You see, the last time I upgraded my phone, 2-year contracts were still a thing. I could walk into an AT&T store, pay $199 plus the tax on the phone itself, and walk out with a brand new, latest edition phone in exchange for simply committing to keep the exact terms of my cellular contract intact for another two years, which I probably would have done anyway (especially since I still had a sweet deal that I grandfathered in from the good ol’ Cingular Wireless days). It was beautiful. And better yet, it was easy!
Now, faced with the prospect of potentially upgrading my phone in the post-2-year contract age, I didn’t know what to expect. I knew that I could still walk into an AT&T store, but that now I’d have to choose between a 24-or-30-month option to pay my phone off. Plus, in doing so, I’d also have to switch up my contract terms and give up my beloved, Cingular-era unlimited plan. Of course, I could always pay for the phone outright in order to keep my plan intact… Except, I couldn’t. Because I didn’t exactly have an extra $800 lying around. And all of this was under the assumption that I’d keep the same wireless carrier to begin with. I didn’t even want to imagine the added layer of complexity switching to a new carrier could bring.
Then I started thinking about all the extraneous trade-in, monthly bill credit, BOGO, free accessory, and gift card promo offers on smartphones that I personally track every week at gap intelligence, and all of the different conditions that have to be met for one to even qualify for most of them. It’s enough to make your head spin. Like, what exactly does the term “with qualified activation” entail anyway? Generally speaking, I’m not a fan of fine print, and I don’t know many people who are either.
Needless to say, my fear of the unknown was enough for me to go with option A and fork over the $29 to replace my iPhone 6’s battery. The alternative was just too stressful for me. And furthermore, the nominal improvements to the newer model iPhones didn’t dazzle me enough to justify an upgrade in the first place, but that’s an entirely different story. The point I’m trying to make, as I said at the beginning of this piece, is that the smartphone market is CONFUSING! And buying a new smartphone certainly can be as well.
However, for those who are ready to make the leap towards a new smartphone purchase, there are ways to seek guidance on their quest without all the extra noise that comes with contracts, carriers, and promotions. Ultimately, some people just want to get down to brass tacks and find out which model of phone is the best or most popular, and one increasingly popular way to do so is by checking out its user reviews. Or, more specifically, its star ratings.
It’s a little-known fact that, in addition to all the other product information that gap intelligence scrapes from online retailer websites each week, we’ve actually been capturing star ratings data from BestBuy.com and Amazon for quite some time as well. And although we’re not currently including this data in any of our reporting, it’s something that we could potentially offer at some point in the future. In the meantime, though, it’s kind of fun to play with.
As a consumer who was recently scared off by the notion of a new smartphone purchase and everything that could possibly come with it, I was curious about how one could try to determine which phone is best strictly from a popularity-based, star rating standpoint. I already had the data at my disposal, which included both the average star ratings and total number of reviews for each product, so all I had to do was a little bit of tinkering to see what findings I could come up with.
It was also important that I set some parameters. Both BestBuy.com and Amazon currently sell several products that have never been reviewed before so, unfortunately, those products had to be excluded from the report. I also realized that sample size was a major factor. I mean, it’s not really fair to declare a particular model of phone as the least popular on an entire website because it was reviewed just once and happened to get a 1-star review. Because of this, I decided only to include products with a minimum of 100 total reviews.
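The filtering step described above can be sketched in a few lines of Python. The product names and numbers below are made up purely for illustration, not actual BestBuy.com or Amazon data:

```python
# Hypothetical review data: (product name, average star rating, total reviews).
products = [
    ("Phone A", 4.7, 523),
    ("Phone B", 4.9, 12),    # too few reviews to be meaningful
    ("Phone C", None, 0),    # never reviewed, so excluded outright
    ("Phone D", 3.6, 148),
]

MIN_REVIEWS = 100

# Keep only products that have been reviewed at least MIN_REVIEWS times;
# this also drops never-reviewed products, since their count is 0.
qualified = [p for p in products if p[2] >= MIN_REVIEWS]

print([name for name, _, _ in qualified])  # ['Phone A', 'Phone D']
```

A single review-count threshold handles both parameters at once: unreviewed products and thinly reviewed ones fall out of the data set together.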
By establishing these two parameters, the remaining lists of products for both websites would only include smartphone models that were well-established enough to have received not only a review, but many of them. The formula wasn’t perfect since many newer-model products would be omitted from the report altogether as a result, but I felt that it was good enough to at least be able to draw some conclusions.
I also had to strip my product lists down considerably so that only one model from each respective product line would be represented. For example, it didn’t make sense to compare the star ratings for the AT&T iPhone 7 Plus 128GB model versus the Verizon iPhone 7 Plus 32GB one (even though their ratings do differ) because, at the end of the day, they’re both iPhone 7 Pluses! Therefore, I scaled each product line down to just one product, using a combination of the highest average star rating and number of reviews across ALL products in a given family to determine which individual listing would represent the product line overall. Yes, it sounds confusing, but it actually made the results much less so. And with that done, I could finally get in there and see what we found.
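A rough sketch of that per-family consolidation is below. The post doesn’t specify exactly how rating and review count were combined, so this version simply ranks by average rating first and breaks ties by review count; the listings shown are illustrative, not real data:

```python
# Hypothetical listings: (product line, listing name, avg rating, review count).
listings = [
    ("iPhone 7 Plus", "AT&T iPhone 7 Plus 128GB", 4.6, 310),
    ("iPhone 7 Plus", "Verizon iPhone 7 Plus 32GB", 4.7, 250),
    ("Galaxy S8", "Unlocked Galaxy S8 64GB", 4.8, 640),
]

# Keep one representative listing per product line, preferring the
# highest average rating and, on ties, the most reviews.
best = {}
for line, name, rating, reviews in listings:
    if line not in best or (rating, reviews) > (best[line][1], best[line][2]):
        best[line] = (name, rating, reviews)

print(best["iPhone 7 Plus"][0])  # Verizon iPhone 7 Plus 32GB
```

Whatever the exact weighting, the effect is the same: each product family collapses to a single row before any rankings are drawn.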
And the Results Are In…
First off was BestBuy.com. I find it necessary to preface my BestBuy.com results with a bit of a disclaimer… On BestBuy.com, the minimum average star rating for products with 100+ reviews was 3.5, with a maximum average rating of 4.9. I don’t know about you, but if a restaurant had a 3.5-star average Yelp rating, that would still probably be enough for me to eat there. The point is, ALL of BestBuy.com’s current smartphones (with 100+ reviews) have favorable user ratings, which, if nothing else, may seem at least a little unusual. Don't get me wrong, I'm certainly not implying that the validity of these ratings is questionable. It's just something worth mentioning.
That said, the top-rated smartphone currently offered at BestBuy.com is none other than… The Huawei Mate 10 Pro! The Mate 10 Pro came in with a whopping 4.9-star average rating after 109 customer reviews, which is quite an amazing feat considering the fact that, at the time of writing, it was only available for pre-order. In fact, recent reports have accused Huawei of actually encouraging people to write positive reviews of the unreleased Mate 10 Pro on BestBuy.com, so I guess here’s your proof! And while one can likely assume that the vast majority of BestBuy.com customer ratings are genuine, it appears as though there may be ways for inauthentic ones to slip through, and may at least partially explain why the reviews are so overwhelmingly favorable on the site as well.
The Mate 10 Pro notwithstanding, the top 20 smartphone product families in terms of average star rating on BestBuy.com broke down as follows, drumroll please:
Interestingly, but not really surprisingly, Samsung and Apple accounted for all of the top 10 smartphone models, and 14 of the top 20 overall. Meanwhile, Google and Motorola made the list 2 times apiece, while smartphones from LG and Huawei each placed once. It's worth noting that Huawei's Mate 9 is a much more well-established (not to mention available) model than the aforementioned Mate 10 Pro, so its ratings can probably be considered far less questionable to boot.
Conversely, the 20 bottom-ranking product families at BestBuy.com broke down like this:
If you’re unlocked smartphone maker BLU, I’ve got just one word for you. Ouch. Their Grand M, Vivo 5 Mini, and Studio XL2 handsets accounted for all of the bottom 3-rated models on BestBuy.com, with the R1 Plus making the bottom 20 as well. And while Samsung’s premium phones had a good showing in the top 20 list, lower-tier models such as the J3 V, Galaxy J1, and Galaxy Sol actually comprised a total of 7 of the bottom 20-rated smartphones. At any rate, the pattern on BestBuy.com seems to indicate that high-end models in general are rated more favorably than budget ones.
Now it was on to Amazon. Like BestBuy.com, Amazon's customer review system may not be able to prevent all bogus ratings. However, per Amazon’s terms regarding prohibited seller activities and actions, sellers may not offer compensation for reviews, nor are they allowed to review their own products or their competitors' products, which in theory should translate to more reliable reviews/star ratings overall. The average star rating of smartphones with 100+ reviews ranges from 2.7 to 4.4 on Amazon, which in itself suggests that its ratings are fairly authentic.
But before we jump into the findings, there are a few things I need to say about the online giant as a precursor. First off, gap intelligence does not track smartphones sold by third-party merchants within Amazon’s marketplace, so none of those models made the final data set. That said, although many of Amazon’s third-party merchants do sell Apple iPhones, Amazon itself does not, so none of those made the final data set either. Lastly, Amazon does tend to hang onto REALLY old smartphone models, oftentimes well past when the rest of the market might deem them discontinued.
Just remember that I told you this when you’re looking over the following list of top 20-rated phones on Amazon:
As you can see, sitting at the top of the list with over 800 reviews and a 4.4-star average rating is Motorola’s 1st-generation Moto E (which was released in May of 2014), followed by Huawei’s Mate 9, which launched in late 2016. Samsung models again made a decent showing, with 6 total placements in the top 20. One of those models, however, was the Galaxy S5, another line that was released all the way back in 2014. And while BLU products rated poorly on BestBuy.com, here we see that both the LTE and non-LTE models of the R2 cracked the top 20 list.
Rounding out the bottom 20 on Amazon were these smartphones:
Yikes, looks like I spoke too soon about BLU, eh? Despite having some favorable ratings in the top 20 list, BLU handsets accounted for 11 of the bottom 20-ranked phones on Amazon, and 4 of the bottom 10 overall, including the worst-rated Advance 4.0 L3. That’s really the big story here, as obscure models such as the Polaroid Link A4, Posh Kick X511a, and CAT S60 all, somewhat predictably, made the list as well, while Samsung’s lone entry in the bottom 20 was the dated, low-tier Galaxy On5.
Now Weight Just a Minute There!
As interesting as it was to review the star ratings findings I came up with and see which products/brands were the most popular in terms of their respective website reviews, I couldn’t help but think that I might be looking at the data somewhat inaccurately. After all, was it fair for me to give a product with a 4.4-star average rating after 531 total reviews a higher overall ranking than a product with a 4.3-star rating but a whopping 663 reviews? Something about it just didn’t feel right, so I opted to write a formula that would calculate each product’s weighted star rating: its average star rating multiplied by the number of reviews the product received, divided by the total number of reviews sitewide at each respective retailer. Or, if visual representations are more your thing: weighted rating = (average star rating × product’s review count) ÷ total sitewide reviews.
Using this metric, each product’s overall popularity ranking would be calculated by taking into account not only its average star rating and number of reviews, but also the number of reviews relative to the total number of reviews posted for all products on that website. The higher the ratio, the higher the weighted star rating.
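Applied to the 4.4-star/531-review versus 4.3-star/663-review example above, the weighted rating flips the ranking. A minimal Python sketch (the function name is mine; the 39,343 figure is BestBuy.com's sitewide review total used in this post):

```python
def weighted_rating(avg_rating, review_count, total_sitewide_reviews):
    """Weighted star rating: (average rating * product's review count)
    divided by the total number of reviews sitewide at that retailer."""
    return avg_rating * review_count / total_sitewide_reviews

# 4.4 stars over 531 reviews vs. 4.3 stars over 663 reviews,
# against 39,343 total reviews sitewide.
a = weighted_rating(4.4, 531, 39343)
b = weighted_rating(4.3, 663, 39343)
print(a < b)  # True: the larger review count outweighs the higher average
```

Note that the sitewide denominator is the same for every product at a given retailer, so within one site it only rescales the scores; the ranking is effectively driven by average rating times review count.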
That said, the top 20 smartphones at BestBuy.com based on the weighted star rating system and 39,343 total reviews sitewide were as follows:
While the bottom 20 list looked like this:
As you can see, there are some similarities between these adjusted ratings and the original ones. Smartphones by Samsung and Apple, for example, still dominate the top 20 list, with 11 total placements overall, while BLU handsets and lower-tier Samsung models each make the bottom 20 list multiple times. However, Google’s Pixel 2, which was rated the 14th most popular phone overall in the previous list, is now the 5th least popular! Despite having an average star rating of 4.7, the 114 total reviews it received were not enough to give it as favorable a standing as the LG G6, for example, which also had a 4.7-star rating but 188 total reviews, placing it at 33rd most popular overall. So needless to say, the weighted star rating formula does make a difference, and certainly makes you look at things from a different perspective.
Looking at Amazon’s weighted ratings, the top 20 list based on 23,379 total reviews sitewide broke down like this:
While Amazon’s bottom 20 list now appeared as so:
Again, there are some parallels that can be drawn between Amazon’s weighted ratings and the previous, non-weighted findings, but also several differences as well. For example, Samsung’s Galaxy S8 and S8+ models now occupy the top 2 spots in terms of popularity, which is probably a more accurate representation than Motorola’s Moto E, which now assumes the 3rd spot. Ironically, BLU smartphones now account for 25% of the top 20 phones as well, although they are still more widely represented on the bottom 20 list, with 8 total placements including the lowest overall ranking with the Advance 4.0 L3 again.
So what does it all mean? One could certainly argue that when you start to see some of the same smartphones on the same types of lists, even if the lists were compiled in completely different ways, there might indeed be something to it. However, I think the results can probably be best described as inconclusive. Yes, star ratings and user reviews are indeed gaining popularity amongst consumers who need help with their buying decisions. And as such, these ratings/reviews are becoming increasingly important to the manufacturers themselves.
All in all though, I believe a case can be made that when it comes to online star ratings and reviews, you don't know exactly what you’re getting. Even so, here at gap intelligence, we realize the growing significance of these metrics among both the manufacturers and retailers alike, which is why we initially began to tinker with collecting star ratings data in the first place! If you're interested in learning more about our star ratings data capabilities, or have any insights you'd like to share, we'd love to hear from you!
But for me personally, are they helpful? Maybe. Should other shoppers use them to make their own purchase decisions? Possibly so. Will they ever be so convincing that they make me forget about all the other things that stress me out about buying a new phone? Absolutely not.